🤖 AI Summary
This work addresses the problem of efficiently learning arbitrary Boolean functions of $k$ arbitrary halfspaces over $\mathbb{R}^n$ in the distribution-free PAC model. To this end, the authors propose an algorithm that combines the geometric structure of halfspaces, representations of Boolean functions, dimensionality reduction, and combinatorial analysis. The method runs in time $2^{\sqrt{n} \cdot (\log n)^{O(k)}}$ and is the first PAC learning algorithm achieving time $2^{o(n)}$ even for intersections of two halfspaces. This substantially improves on the previously known exponential-time approaches and advances the theoretical frontier of learning high-dimensional Boolean functions.
📝 Abstract
We give an algorithm that learns arbitrary Boolean functions of $k$ arbitrary halfspaces over $\mathbb{R}^n$, in the challenging distribution-free Probably Approximately Correct (PAC) learning model, running in time $2^{\sqrt{n} \cdot (\log n)^{O(k)}}$. This is the first algorithm that can PAC learn even intersections of two halfspaces in time $2^{o(n)}$.
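For concreteness, the concept class in question can be written out as follows. This is the standard formalization of "Boolean functions of $k$ halfspaces" and is a sketch in common notation, not the paper's own definitions:

```latex
% Each halfspace h_i : \mathbb{R}^n \to \{0,1\} is given by a weight vector and a threshold:
h_i(x) = \mathbb{1}\bigl[\, w_i \cdot x \ge \theta_i \,\bigr],
  \qquad w_i \in \mathbb{R}^n,\ \theta_i \in \mathbb{R}.

% The learner must handle ANY combining function g : \{0,1\}^k \to \{0,1\}:
f(x) = g\bigl( h_1(x), \dots, h_k(x) \bigr).

% The special case g(y_1, \dots, y_k) = y_1 \wedge \dots \wedge y_k
% is the intersection of k halfspaces mentioned in the abstract.
```

Note that even the case $k = 2$ with $g = \wedge$ (the intersection of two halfspaces) had no known $2^{o(n)}$-time PAC learner prior to this work.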