Worst-case to average-case reductions revisited

Dan Gutfreund, Amnon Ta-Shma

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

A fundamental goal of computational complexity (and the foundations of cryptography) is to find a polynomial-time samplable distribution (e.g., the uniform distribution) and a language in NTIME(f(n)), for some polynomial function f, such that the language is hard on average with respect to this distribution, given that NP is worst-case hard (i.e., NP ≠ P, or NP ⊈ BPP). Currently, no such result is known even if we relax the language to be in nondeterministic sub-exponential time. There has been a long line of research trying to explain our failure to prove such worst-case/average-case connections [FF93, Vio03, BT03, AGGM06]. The bottom line of this research is essentially that (under plausible assumptions) non-adaptive Turing reductions cannot prove such results.

In this paper we revisit the problem. Our first observation is that the above-mentioned negative arguments extend to a non-standard notion of average-case complexity, in which the distribution on the inputs, with respect to which we measure the average-case complexity of the language, is only samplable in super-polynomial time. The significance of this result stems from the fact that in this non-standard setting, [GSTS05] did show a worst-case/average-case connection. In other words, their techniques give a way to bypass the impossibility arguments. By taking a closer look at the proof of [GSTS05], we discover that the worst-case/average-case connection is proven by a reduction that "almost" falls under the category ruled out by the negative result. This gives rise to an intriguing new notion of (almost black-box) reductions.

After extending the negative results to the non-standard average-case setting of [GSTS05], we ask whether their positive result can be extended to the standard setting, to prove some new worst-case/average-case connections. While we cannot do that unconditionally, we are able to show that under a mild derandomization assumption, the worst-case hardness of NP implies the average-case hardness of NTIME(f(n)) (under the uniform distribution), where f is computable in quasi-polynomial time.
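To fix notation, here is a schematic rendering in symbols of the hardness notion and the main conditional result described above. This is only an illustrative sketch: the inequality below is one standard way to formalize average-case hardness, and the paper's exact quantifiers, hardness parameters, and derandomization hypothesis are not reproduced here.

% Average-case hardness of a language L with respect to a distribution
% ensemble D = {D_n} (schematic; the success-probability threshold
% varies across formulations): for every probabilistic polynomial-time
% algorithm A and infinitely many input lengths n,
\[
  \Pr_{x \sim D_n}\bigl[\, A(x) = L(x) \,\bigr] \;\le\; 1 - \frac{1}{\mathrm{poly}(n)} .
\]

% The conditional worst-case/average-case connection of this paper,
% stated schematically:
\[
  \mathrm{NP} \not\subseteq \mathrm{BPP}
  \;\wedge\; \text{(mild derandomization assumption)}
  \;\Longrightarrow\;
  \exists\, L \in \mathrm{NTIME}(f(n)) \ \text{hard on average w.r.t. } \{U_n\},
\]

% where U_n is the uniform distribution on {0,1}^n and f is computable
% in quasi-polynomial time, i.e., in time 2^{polylog(n)}.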

Original language: English
Title of host publication: Approximation, Randomization, and Combinatorial Optimization
Subtitle of host publication: Algorithms and Techniques - 10th International Workshop, APPROX 2007 and 11th International Workshop, RANDOM 2007, Proceedings
Publisher: Springer Verlag
Pages: 569-583
Number of pages: 15
ISBN (Print): 9783540742074
State: Published - 2007
Event: 10th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX 2007 and 11th International Workshop on Randomization and Computation, RANDOM 2007 - Princeton, NJ, United States
Duration: 20 Aug 2007 – 22 Aug 2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4627 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 10th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems, APPROX 2007 and 11th International Workshop on Randomization and Computation, RANDOM 2007
Country/Territory: United States
City: Princeton, NJ
Period: 20/08/07 – 22/08/07
