DNF Learning via Locally Mixing Random Walks

📅 2025-05-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work studies distribution-free PAC learning of DNF formulas with membership queries, where the learner must succeed under an arbitrary and unknown distribution over $\{0,1\}^n$. Its contributions are threefold: (1) a new structural result on "locally mixing random walks," showing that a random walk on a graph covered by a small number of expanders mixes quickly, with non-negligible probability, within a subset of those expanders; (2) a quasipolynomial-time (quasipoly(n,s)) "list-decoding" algorithm that, using membership queries and random examples, outputs a list of candidate terms that with high probability contains some term of the target s-term DNF; and (3) building on (2), a quasipoly(n,s)-time distribution-free PAC learning algorithm, with membership queries, for size-s DNFs in which all terms have the same size. Unlike prior approaches that require restrictive distributional assumptions such as uniformity, this method imposes no constraints on the input distribution.

📝 Abstract
We give two results on PAC learning DNF formulas using membership queries in the challenging "distribution-free" learning framework, where learning algorithms must succeed for an arbitrary and unknown distribution over $\{0,1\}^n$. (1) We first give a quasi-polynomial time "list-decoding" algorithm for learning a single term of an unknown DNF formula. More precisely, for any target $s$-term DNF formula $f = T_1 \vee \cdots \vee T_s$ over $\{0,1\}^n$ and any unknown distribution $D$ over $\{0,1\}^n$, our algorithm, which uses membership queries and random examples from $D$, runs in $\textsf{quasipoly}(n,s)$ time and outputs a list $L$ of candidate terms such that with high probability some term $T_i$ of $f$ belongs to $L$. (2) We then use result (1) to give a $\textsf{quasipoly}(n,s)$-time algorithm, in the distribution-free PAC learning model with membership queries, for learning the class of size-$s$ DNFs in which all terms have the same size. Our algorithm learns using a DNF hypothesis. The key tool used to establish result (1) is a new result on "locally mixing random walks," which, roughly speaking, shows that a random walk on a graph that is covered by a small number of expanders has a non-negligible probability of mixing quickly in a subset of these expanders.
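To make the setup of result (1) concrete, here is a minimal sketch of the objects the abstract defines: a term, an $s$-term DNF, a membership-query oracle, and the list-decoding success condition (some true term of $f$ appears in the output list $L$). Everything here is illustrative; the paper's actual quasipoly-time algorithm for producing $L$ is not shown, and the example target DNF and candidate list are made up.

```python
def eval_term(term, x):
    # A term is a dict {variable index: required bit}; it is satisfied
    # when every listed variable of x takes its required value.
    return all(x[i] == b for i, b in term.items())

def eval_dnf(terms, x):
    # A DNF f = T_1 v ... v T_s accepts x iff some term accepts x.
    return any(eval_term(t, x) for t in terms)

# Hypothetical target 3-term DNF over {0,1}^6.
f_terms = [{0: 1, 1: 1}, {2: 0, 3: 1}, {4: 1, 5: 0}]

def membership_query(x):
    # MQ oracle: returns f(x) at any point x the learner chooses.
    return eval_dnf(f_terms, x)

# The list-decoding guarantee: the algorithm outputs a list L of candidate
# terms such that, with high probability, some true term T_i of f is in L.
L = [{0: 1, 1: 1}, {1: 0, 4: 1}]  # hypothetical output list
success = any(t in f_terms for t in L)
print(success)  # True: {0: 1, 1: 1} is a term of f
```

Note that the guarantee is only containment in a list, not identification of a single term; the learner in result (2) must still winnow the candidates.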
Problem

Research questions and friction points this paper is trying to address.

PAC learning DNF formulas under unknown distributions
Quasi-polynomial time list-decoding for single DNF terms
Learning uniform-size DNFs using membership queries
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quasi-polynomial time list-decoding algorithm
Distribution-free PAC learning with membership queries
Locally mixing random walks for term learning
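The local-mixing phenomenon above can be illustrated with a toy simulation (this is only an intuition aid, not the paper's construction): on a graph made of two well-connected blocks joined by a single bridge edge, a lazy random walk started inside one block tends to spread out within that block well before it escapes to the other, i.e., it "mixes locally." The graph, step counts, and thresholds below are all arbitrary choices for illustration.

```python
import random

def lazy_walk_step(adj, v, rng):
    # Lazy walk: stay put with probability 1/2, else move to a
    # uniformly random neighbor.
    if rng.random() < 0.5:
        return v
    return rng.choice(adj[v])

# Two 10-cliques (each a stand-in for one "expander" in the cover),
# joined by a single bridge edge 0 -- 10.
n = 10
adj = {v: [] for v in range(2 * n)}
for block in (range(0, n), range(n, 2 * n)):
    for i in block:
        for j in block:
            if i != j:
                adj[i].append(j)
adj[0].append(n)
adj[n].append(0)

rng = random.Random(0)
trials, steps = 2000, 30
end_counts = [0] * (2 * n)
for _ in range(trials):
    v = 5  # start inside the first block
    for _ in range(steps):
        v = lazy_walk_step(adj, v, rng)
    end_counts[v] += 1

# After 30 steps the walk is spread over the starting block but has
# rarely crossed the bridge: it has mixed locally, not globally.
frac_in_start_block = sum(end_counts[:n]) / trials
print(frac_in_start_block)
```

The paper's theorem is of course far stronger and more general (arbitrary graphs covered by a small number of expanders), but the simulation shows the basic effect: fast mixing inside a piece of the cover despite slow global mixing.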