Samplability makes learning easier

📅 2025-11-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the fundamental distinction between *samplable* PAC learning, where learners must succeed only on efficiently samplable distributions, and standard PAC learning, which requires correctness under all distributions, and its implications for learning efficiency. It introduces a new complexity-theoretic primitive, the *explicit evasive set*, and uses it to construct a concept class that requires exponentially many samples in standard PAC yet is learnable with polynomially many samples in samplable PAC. Lifting this statistical separation to the computational setting, relative to a random oracle, yields the first computational separation between the two learning paradigms. The result demonstrates that distributional samplability can provably reduce sample complexity, thereby expanding the frontier of efficient learnability. The separation further extends to online learning, showing that the computational power of the adversary fundamentally constrains learnability: restricting the adversary to polynomial-time computation enables efficient learning where computationally unbounded adversaries preclude it.

📝 Abstract
The standard definition of PAC learning (Valiant 1984) requires learners to succeed under all distributions -- even ones that are intractable to sample from. This stands in contrast to samplable PAC learning (Blum, Furst, Kearns, and Lipton 1993), where learners only have to succeed under samplable distributions. We study this distinction and show that samplable PAC substantially expands the power of efficient learners. We first construct a concept class that requires exponential sample complexity in standard PAC but is learnable with polynomial sample complexity in samplable PAC. We then lift this statistical separation to the computational setting and obtain a separation relative to a random oracle. Our proofs center around a new complexity primitive, explicit evasive sets, that we introduce and study. These are sets for which membership is easy to determine but are extremely hard to sample from. Our results extend to the online setting to similarly show how its landscape changes when the adversary is assumed to be efficient instead of computationally unbounded.
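The abstract's central primitive, evasive sets, are sets whose membership is easy to decide but which are extremely hard to sample from. A toy sketch of the underlying intuition (this is an illustration of the easy-to-check, hard-to-sample phenomenon, not the paper's construction): model a random oracle with a salted hash and define the set as the strings whose hash begins with K zero bits. One hash call decides membership, but blind search needs about 2^K attempts to find a member; in the actual random-oracle setting the density is made so small that no efficient sampler succeeds. The parameter names and the salted-SHA-256 oracle below are illustrative choices, not from the paper.

```python
import hashlib
import secrets

K = 20  # sparsity parameter: S has density ~2^-K (a toy value; the real
        # construction needs super-polynomially small density)
SALT = b"fixed-oracle-seed"  # hypothetical seed standing in for the oracle

def oracle(x: bytes) -> bytes:
    """Stand-in for a random oracle: a salted SHA-256 hash."""
    return hashlib.sha256(SALT + x).digest()

def in_S(x: bytes) -> bool:
    """Membership test: one oracle query, so deciding membership is cheap."""
    h = int.from_bytes(oracle(x), "big")
    return h >> (256 - K) == 0

def try_to_sample(budget: int):
    """Blind search for a member; succeeds with probability ~budget / 2^K."""
    for _ in range(budget):
        x = secrets.token_bytes(16)
        if in_S(x):
            return x
    return None

# Membership is cheap to verify for any given string...
print(in_S(b"hello"))       # almost surely False at density 2^-20
# ...but a small sampling budget almost never finds a member.
print(try_to_sample(1000))  # very likely None
```

Note the caveat this toy version makes visible: any efficiently decidable set of non-negligible density can be sampled by rejection sampling, so genuine evasiveness requires the density to shrink faster than any inverse polynomial, which is where the random-oracle analysis in the paper does the work.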
Problem

Research questions and friction points this paper is trying to address.

Is samplable PAC learning strictly more powerful than standard PAC learning?
What complexity primitive can witness a computational separation between the two models?
How does restricting the online adversary to efficient computation change the landscape of learnability?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Samplable PAC learning reduces sample complexity
Explicit evasive sets enable computational separations
Efficient adversaries alter online learning landscape