Weakly learning DNF and characterizing statistical query learning using Fourier analysis
Proceedings of the twenty-sixth annual ACM symposium on Theory of computing, 1994
Abstract
We present new results, both positive and negative, on the well-studied problem of learning disjunctive normal form (DNF) expressions.
We first prove that an algorithm due to Kushilevitz and Mansour [16] can be used to weakly learn DNF using membership queries in polynomial time, with respect to the uniform distribution on the inputs. This is the first positive result for learning unrestricted DNF expressions in polynomial time in any nontrivial formal model of learning. It provides a sharp contrast with the results of Kharitonov [15], who proved that AC⁰ is not efficiently learnable in the same model (given certain plausible cryptographic assumptions). We also present efficient learning algorithms in various models for the read-k and SAT-k subclasses of DNF.

For our negative results, we turn our attention to the recently introduced statistical query model of learning [11]. This model is a restricted version of the popular Probably Approximately Correct (PAC) model [23], and practically every class known to be efficiently learnable in the PAC model is in fact learnable in the statistical query model [11]. Here we give a general characterization of the complexity of statistical query learning in terms of the number of uncorrelated functions in the concept class. This is a distribution-dependent quantity yielding upper and lower bounds on the number of statistical queries required for learning on any input distribution. As a corollary, we obtain that DNF expressions and decision trees are not even weakly learnable with respect to the uniform input distribution in the statistical query model.
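The Fourier connection behind the first result can be made concrete. Under the uniform distribution, every Boolean function f: {0,1}^n → {-1,+1} has Fourier coefficients f̂(S) = E[f(x)·χ_S(x)] over the parity characters χ_S, and any coefficient of large magnitude yields a weak predictor h(x) = sign(f̂(S))·χ_S(x). The sketch below is a toy illustration of that fact only, not the Kushilevitz-Mansour algorithm itself (which locates large coefficients without exhaustive search over subsets); the example DNF, variable count, and sample sizes are made up, and each evaluation of the target function plays the role of a membership query.

```python
import itertools
import random

def chi(S, x):
    # Parity character chi_S(x) = (-1)^(sum of x_i for i in S)
    return -1 if sum(x[i] for i in S) % 2 else 1

def fourier_coeff(f, S, n):
    # Exact coefficient f_hat(S) = E_x[f(x) * chi_S(x)] over uniform x.
    # Exhaustive over all 2^n inputs, so only sensible for small n.
    total = sum(f(x) * chi(S, x) for x in itertools.product((0, 1), repeat=n))
    return total / 2 ** n

def estimate_coeff(f, S, n, samples=4000, rng=None):
    # Estimate f_hat(S) by sampling uniform inputs; each call f(x)
    # simulates a membership query to the target function.
    rng = rng or random.Random()
    acc = 0
    for _ in range(samples):
        x = tuple(rng.randint(0, 1) for _ in range(n))
        acc += f(x) * chi(S, x)
    return acc / samples

n = 4

def target(x):
    # A tiny hypothetical DNF, (x0 AND x1) OR x3, with outputs in {-1, +1}.
    return 1 if (x[0] and x[1]) or x[3] else -1

# Brute-force search for the largest-magnitude coefficient (for illustration;
# KM avoids exactly this exponential enumeration).
subsets = [frozenset(S) for k in range(n + 1)
           for S in itertools.combinations(range(n), k)]
best = max(subsets, key=lambda S: abs(fourier_coeff(target, S, n)))
# The weak hypothesis is then h(x) = sign(f_hat(best)) * chi_best(x),
# which agrees with the target on strictly more than half the inputs.
```

For this target, the dominant coefficient sits on S = {3} with value -3/4, so the single parity -χ_{3}(x) already predicts the DNF on 7/8 of the inputs, a weak approximation of exactly the kind the membership-query result exploits.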