🤖 AI Summary
Classical VC theory and PAC learning lack a rigorous foundation for higher-arity learning scenarios, where hypotheses are *n*-ary relations with structured, multi-way dependencies.
Method: This work generalizes VC theory and PAC learning to *n*-fold product spaces under product measures, introducing the notions of higher-order VCₙ-dimension and PACₙ learnability. Leveraging tools from model theory, combinatorics, and probabilistic methods, it gives a formal definition of VCₙ-dimension, extends Haussler's packing lemma, and establishes a *shattered hypergraph regularity lemma*.
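As a rough sketch (our paraphrase of the standard grid-shattering notion; the precise formulation is in arXiv:2010.00726), the VCₙ-dimension of a class $\mathcal{F}$ of subsets of a product $Y_1 \times \dots \times Y_n$ measures the largest grid it shatters:

$$
\mathrm{VC}_n(\mathcal{F}) \geq d \iff \exists\, A_1 \subseteq Y_1, \dots, A_n \subseteq Y_n \text{ with } |A_i| = d \text{ such that } \forall\, S \subseteq A_1 \times \dots \times A_n \;\, \exists\, F \in \mathcal{F}: \; S = F \cap (A_1 \times \dots \times A_n).
$$

For $n = 1$ this recovers the classical VC dimension, where a single finite set is shattered by traces of the class.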
Contribution/Results: It provides a complete characterization of PACₙ learnability: a hypothesis class is PACₙ-learnable if and only if its VCₙ-dimension is finite. This unifies and recovers key results from several recent works in higher-order statistical learning, yielding a systematic theoretical framework for learning over *n*-ary relations and higher-order interactions.
📝 Abstract
The aim of this note is to overview some of our work in Chernikov, Towsner'20 (arXiv:2010.00726) developing higher arity VC theory (VC$_n$ dimension), including a generalization of Haussler's packing lemma and an associated tame (slice-wise) hypergraph regularity lemma; and to demonstrate that it characterizes higher arity PAC learning (PAC$_n$ learning) in $n$-fold product spaces with respect to product measures, a framework introduced by Kobayashi, Kuriyama and Takeuchi'15. We also point out how some of the recent results in arXiv:2402.14294, arXiv:2505.15688, arXiv:2509.20404 follow from our work in arXiv:2010.00726.
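To make the grid-shattering notion concrete, here is a toy illustration in Python (our own sketch, not code or a construction from the paper): a brute-force check of whether a finite class of binary relations shatters a given grid, i.e. witnesses VC$_2$-dimension at least the grid's side length. The function name `shatters_grid` and the rectangle example are illustrative assumptions.

```python
from itertools import chain, combinations, product

def shatters_grid(family, rows, cols):
    """Return True iff every subset of the grid rows x cols arises as
    the trace of some relation in `family` on that grid.
    Exponential in the grid size; toy inputs only."""
    grid = list(product(rows, cols))
    traces = {frozenset(p for p in grid if p in rel) for rel in family}
    all_subsets = chain.from_iterable(
        combinations(grid, r) for r in range(len(grid) + 1))
    return all(frozenset(s) in traces for s in all_subsets)

# Toy example: "combinatorial rectangles" A x B over a 2x2 grid do NOT
# shatter it, since e.g. the diagonal {(0,0),(1,1)} is not a rectangle.
points = [0, 1]
rectangles = [frozenset(product(A, B))
              for A in [(), (0,), (1,), (0, 1)]
              for B in [(), (0,), (1,), (0, 1)]]
print(shatters_grid(rectangles, points, points))  # prints False
```

By contrast, the class of *all* subsets of the 2x2 grid trivially shatters it, so bounding VC$_2$ is genuinely a restriction on the class.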