Weakly learning DNF and characterizing statistical query learning using Fourier analysis

Avrim Blum, Merrick Furst, Jeffrey Jackson, Michael Kearns, Yishay Mansour, Steven Rudich

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


We present new results, both positive and negative, on the well-studied problem of learning disjunctive normal form (DNF) expressions. We first prove that an algorithm due to Kushilevitz and Mansour [16] can be used to weakly learn DNF using membership queries in polynomial time, with respect to the uniform distribution on the inputs. This is the first positive result for learning unrestricted DNF expressions in polynomial time in any nontrivial formal model of learning. It provides a sharp contrast with the results of Kharitonov [15], who proved that AC0 is not efficiently learnable in the same model (given certain plausible cryptographic assumptions). We also present efficient learning algorithms in various models for the read-k and SAT-k subclasses of DNF. For our negative results, we turn our attention to the recently introduced statistical query model of learning [11]. This model is a restricted version of the popular Probably Approximately Correct (PAC) model [23], and practically every class known to be efficiently learnable in the PAC model is in fact learnable in the statistical query model [11]. Here we give a general characterization of the complexity of statistical query learning in terms of the number of uncorrelated functions in the concept class. This is a distribution-dependent quantity yielding upper and lower bounds on the number of statistical queries required for learning on any input distribution. As a corollary, we obtain that DNF expressions and decision trees are not even weakly learnable with respect to the uniform input distribution in polynomial time in the statistical query model. This result is information-theoretic and therefore does not rely on any unproven assumptions.
It demonstrates that no simple modification of the existing algorithms in the computational learning theory literature for learning various restricted forms of DNF and decision trees from passive random examples (and also several algorithms proposed in the experimental machine learning communities, such as the ID3 algorithm for decision trees [22] and its variants) will solve the general problem. The unifying tool for all of our results is the Fourier analysis of a finite class of Boolean functions on the hypercube.
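The characterization described in the abstract is commonly stated in terms of the "statistical query dimension". The following is an informal sketch of that statement (symbols, exact constants, and tolerances are simplified here and should be checked against the paper itself):

```latex
% SQ-DIM(C, D): the largest number of concepts in the class C that are
% pairwise (nearly) uncorrelated under the input distribution D.
\[
  \mathrm{SQ\text{-}DIM}(\mathcal{C}, D) \;=\;
  \max\Bigl\{\, |S| \;:\; S \subseteq \mathcal{C},\;
     \bigl|\operatorname{E}_{x \sim D}[f(x)\,g(x)]\bigr| \le \tfrac{1}{|S|}
     \ \text{ for all } f \ne g \in S \,\Bigr\}
\]
% Qualitatively: if SQ-DIM(C, D) = d, then any statistical query learner
% for C under D needs on the order of d^{1/3} queries (with comparably
% small tolerance), so a superpolynomial d rules out efficient SQ
% learning, even weak learning.
%
% For DNF under the uniform distribution: the parity functions on
% log n of the n variables are pairwise uncorrelated, each is computed
% by a polynomial-size DNF (and decision tree), and there are
% \binom{n}{\log n} = n^{\Omega(\log n)} of them -- superpolynomially
% many, which yields the corollary stated in the abstract.
```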

Original language: English
Title of host publication: Proceedings of the 26th Annual ACM Symposium on Theory of Computing, STOC 1994
Publisher: Association for Computing Machinery
Number of pages: 10
ISBN (Electronic): 0897916638
State: Published - 23 May 1994
Event: 26th Annual ACM Symposium on Theory of Computing, STOC 1994 - Montreal, Canada
Duration: 23 May 1994 - 25 May 1994

Publication series

Name: Proceedings of the Annual ACM Symposium on Theory of Computing
Volume: Part F129502
ISSN (Print): 0737-8017


Conference: 26th Annual ACM Symposium on Theory of Computing, STOC 1994


