Borsuk-Ulam and Replicable Learning of Large-Margin Halfspaces

📅 2025-03-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the relationship between list replicability—particularly for large-margin halfspaces—and Littlestone dimension, differentially private (DP) learnability, and shared-randomness replicability. It establishes a fundamental separation: for high-dimensional γ-margin halfspaces, list replicability does not follow from bounded Littlestone dimension—a first such result for partial concept classes. The approach combines topological lower bounds via the local Borsuk–Ulam theorem, triangulations of the cross-polytope, and SVM generalization analysis. Key contributions: (1) bounds showing that the list replicability number of d-dimensional γ-margin halfspaces lies between d/2 + 1 and d; (2) a proof that any disambiguation of infinite-dimensional large-margin halfspaces to a total concept class has unbounded Littlestone dimension; and (3) a proof that the maximum list replicability number of homogeneous halfspaces over finite point sets in ℝᵈ equals d, thereby resolving open problems posed by Alon et al., Chase et al., and Fang et al.

📝 Abstract
Recent advances in learning theory have established that, for total concepts, list replicability, global stability, differentially private (DP) learnability, and shared-randomness replicability coincide precisely with the finiteness of the Littlestone dimension. Does the same hold for partial concept classes? We answer this question by studying the large-margin half-spaces class, which has bounded Littlestone dimension and is purely DP-learnable and shared-randomness replicable even in high dimensions. We prove that the list replicability number of $\gamma$-margin half-spaces satisfies
\[ \frac{d}{2} + 1 \le \mathrm{LR}(H_{\gamma}^d) \le d, \]
which increases with the dimension $d$. This reveals a surprising separation for partial concepts: list replicability and global stability do not follow from bounded Littlestone dimension, DP-learnability, or shared-randomness replicability. By applying our main theorem, we also answer the following open problems:

- We prove that any disambiguation of an infinite-dimensional large-margin half-space to a total concept class has unbounded Littlestone dimension, answering an open question of Alon et al. (FOCS '21).
- We prove that the maximum list-replicability number of any *finite* set of points and homogeneous half-spaces in $d$-dimensional Euclidean space is $d$, resolving a problem of Chase et al. (FOCS '23).
- We prove that any disambiguation of the Gap Hamming Distance problem in the large gap regime has unbounded public-coin randomized communication complexity. This answers an open problem of Fang et al. (STOC '25).

We prove the lower bound via a topological argument involving the local Borsuk–Ulam theorem of Chase et al. (STOC '24). For the upper bound, we design a learning rule that relies on certain triangulations of the cross-polytope and recent results on the generalization properties of SVM.
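To make the object of study concrete, here is a minimal sketch of the standard definition of a $\gamma$-margin half-space as a *partial* concept: points that do not clear the margin are left undefined (the function name and `None` encoding are illustrative choices, not the paper's notation).

```python
import numpy as np

def gamma_margin_halfspace(w, gamma):
    """Partial concept h_w on the unit sphere: label a point only when
    it clears the margin gamma; return None (undefined) otherwise."""
    w = np.asarray(w, dtype=float)
    w = w / np.linalg.norm(w)

    def h(x):
        x = np.asarray(x, dtype=float)
        x = x / np.linalg.norm(x)          # work with unit vectors
        ip = float(np.dot(w, x))
        if abs(ip) >= gamma:               # outside the margin band
            return 1 if ip > 0 else -1
        return None                        # inside the margin: undefined

    return h

h = gamma_margin_halfspace([1.0, 0.0], gamma=0.5)
print(h([1.0, 0.0]))   # clears the margin -> 1
print(h([-1.0, 0.1]))  # clears the margin -> -1
print(h([0.1, 1.0]))   # inside the margin -> None
```

The undefined region is what makes this a partial concept class: a learner is only required to be correct on points where the target is defined.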
Problem

Research questions and friction points this paper is trying to address.

Investigates list replicability in partial concept classes with bounded Littlestone dimension.
Resolves open problems on disambiguation and list-replicability in large-margin half-spaces.
Uses topological methods to prove lower bounds on list-replicability numbers.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses the local Borsuk–Ulam theorem for topological lower-bound arguments.
Applies triangulations of the cross-polytope in the learning rule.
Leverages SVM generalization properties for the upper bound.
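The rough intuition behind the list-replicability upper bound can be sketched as follows: a learned direction is rounded to the nearest candidate in a fixed, data-independent finite net, so the learner's output always lies in a finite list. This is only an illustrative stand-in; the paper's actual construction uses triangulations of the cross-polytope together with an SVM step, and the `round_to_net` helper, the axis-direction net, and the crude averaging estimate below are all hypothetical.

```python
import numpy as np

def round_to_net(w, net):
    """Round a direction to the nearest candidate in a fixed finite net.
    Because the net does not depend on the sample, the output is always
    one of len(net) hypotheses -- the rough idea behind list replicability."""
    w = np.asarray(w, dtype=float)
    w = w / np.linalg.norm(w)
    sims = net @ w                       # cosine similarity to each candidate
    return net[int(np.argmax(sims))]

# Hypothetical coarse net of axis directions in R^2 (stand-in for the
# vertices of a triangulated cross-polytope).
net = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])

# Crude direction estimate from labeled data (average of y * x),
# standing in for the SVM step of the authors' learning rule.
X = np.array([[0.9, 0.1], [0.8, -0.2], [-0.7, 0.05]])
y = np.array([1, 1, -1])
w_hat = (y[:, None] * X).mean(axis=0)
print(round_to_net(w_hat, net))          # -> [1. 0.]
```

Two independent samples from the same distribution tend to produce nearby direction estimates, which then round to the same net candidate; the size of the net bounds the list-replicability number from above.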
Ari Blondal — McGill University
Hamed Hatami — McGill University
Pooya Hatami — The Ohio State University
Chavdar Lalov — The Ohio State University
Sivan Tretiak — The Ohio State University