Learning Functions of Halfspaces

📅 2026-03-09
🤖 AI Summary
This work addresses the problem of efficiently learning arbitrary Boolean functions of $k$ halfspaces in the distribution-free PAC model. To this end, the authors propose an algorithm that integrates insights from the geometric structure of halfspaces, representations of Boolean functions, dimensionality reduction, and combinatorial analysis. The method gives the first $2^{o(n)}$-time PAC learning algorithm even for intersections of two halfspaces, achieving a running time of $2^{\sqrt{n} \cdot (\log n)^{O(k)}}$. This result breaks through the previous exponential-time barrier and substantially advances the theoretical frontier of learning high-dimensional Boolean functions.

📝 Abstract
We give an algorithm that learns arbitrary Boolean functions of $k$ arbitrary halfspaces over $\mathbb{R}^n$, in the challenging distribution-free Probably Approximately Correct (PAC) learning model, running in time $2^{\sqrt{n} \cdot (\log n)^{O(k)}}$. This is the first algorithm that can PAC learn even intersections of two halfspaces in time $2^{o(n)}.$
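To make the concept class concrete, here is a minimal sketch (not the paper's algorithm) of the simplest target in the class: an intersection, i.e. an AND, of $k$ halfspaces over $\mathbb{R}^n$. The weight vectors and thresholds below are illustrative assumptions.

```python
import numpy as np

# Illustrative example only, not the paper's learning algorithm.
# A concept in the class studied here is a Boolean function of k halfspaces;
# the simplest case is their intersection (logical AND).

# Two halfspaces in R^2 whose intersection is the closed positive quadrant.
W = np.array([[1.0, 0.0],   # w_1 . x >= 0  <=>  x_1 >= 0
              [0.0, 1.0]])  # w_2 . x >= 0  <=>  x_2 >= 0
b = np.zeros(2)             # thresholds b_1, b_2 (assumed for the demo)

def intersection_label(x):
    """Label 1 iff x satisfies every halfspace constraint w_i . x >= b_i."""
    return int(np.all(W @ np.asarray(x) >= b))

# In the distribution-free PAC model, the learner sees only labeled samples
# (x, f(x)) drawn from an unknown distribution over R^n and must output a
# hypothesis with small error under that same distribution.
print(intersection_label([1.0, 2.0]))    # inside both halfspaces
print(intersection_label([-1.0, 2.0]))   # violates x_1 >= 0
```

A general concept in the class would replace the AND with an arbitrary Boolean function of the $k$ sign patterns $\operatorname{sign}(w_i \cdot x - b_i)$.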
Problem

Research questions and friction points this paper is trying to address.

PAC learning
halfspaces
Boolean functions
distribution-free
learning theory
Innovation

Methods, ideas, or system contributions that make the work stand out.

PAC learning
halfspaces
distribution-free
Boolean functions
subexponential time