Faster exact learning of k-term DNFs with membership and equivalence queries

📅 2025-07-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the exact learning of $k$-term DNF formulas, aiming to break the $\mathrm{poly}(n, 2^k)$ time barrier established by Blum and Rudich (1992). We propose a novel framework integrating dynamic feature-space augmentation with adaptive query strategies, unifying techniques from Winnow2, attribute-efficient learning, learning low-weight linear threshold functions, and junta variable identification. Our theoretical analysis leverages extremal polynomials and noise operators to control approximation error and query complexity. As our main contribution, we achieve the first subexponential-in-$k$ exact learning algorithm for $k$-term DNF, running in $\mathrm{poly}(n) \cdot 2^{\tilde{O}(\sqrt{k})}$ time under the membership and equivalence query model. This constitutes the first substantial speedup over prior bounds and introduces a new paradigm, along with key analytical tools, for learning Boolean functions.

📝 Abstract
In 1992 Blum and Rudich [BR92] gave an algorithm that uses membership and equivalence queries to learn $k$-term DNF formulas over $\{0,1\}^n$ in time $\textsf{poly}(n,2^k)$, improving on the naive $O(n^k)$ running time that can be achieved without membership queries [Val84]. Since then, many alternative algorithms [Bsh95, Kus97, Bsh97, BBB+00] have been given which also achieve runtime $\textsf{poly}(n,2^k)$. We give an algorithm that uses membership and equivalence queries to learn $k$-term DNF formulas in time $\textsf{poly}(n) \cdot 2^{\tilde{O}(\sqrt{k})}$. This is the first improvement for this problem since the original work of Blum and Rudich [BR92]. Our approach employs the Winnow2 algorithm for learning linear threshold functions over an enhanced feature space which is adaptively constructed using membership queries. It combines a strengthened version of a technique from the original work of [BR92] that effectively reduces the length of DNF terms with a range of additional algorithmic tools (attribute-efficient learning algorithms for low-weight linear threshold functions and techniques for finding relevant variables from junta testing) and analytic ingredients (extremal polynomials and noise operators) that are novel in the context of query-based DNF learning.
Problem

Research questions and friction points this paper is trying to address.

Improving runtime for learning k-term DNF formulas
Using membership and equivalence queries efficiently
Combining novel algorithmic tools and analytic techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Enhanced feature space with Winnow2 algorithm
Strengthened DNF term reduction technique
Relevant-variable identification from junta testing, plus extremal polynomials and noise operators as analytic tools