🤖 AI Summary
This work addresses a shortcoming of existing scale-sensitive complexity measures: they fail to fully capture the fine-grained scale behavior of a function class. We introduce the *gap-scale-sensitive dimension*, a novel complexity measure grounded in functional analysis, combinatorial arguments, and probabilistic tools, and we systematically establish its theoretical properties in both sequential and non-sequential learning settings. We prove that this dimension tightly upper-bounds the covering numbers of uniformly bounded function classes and, crucially, derive the first nontrivial lower bounds on offset Rademacher complexities. Compared with classical scale-sensitive dimensions such as the fat-shattering dimension, our framework is more expressive and unified: it strengthens the derivation of convergence-rate lower bounds in both statistical and online learning, yielding a sharper and more broadly applicable tool for complexity-driven lower-bound analysis.
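For context, the sketch below records the classical fat-shattering dimension at scale $\gamma$, the standard notion that the gapped dimension generalizes. The symbols $F$, $x_i$, $s_i$ are generic notation introduced here for illustration; the paper's exact gapped definition is not reproduced.

```latex
% Classical fat-shattering dimension at scale \gamma (standard definition;
% the paper's gap-scale-sensitive dimension refines this notion, and its
% exact form is given in the paper itself).
% A real-valued class F \gamma-shatters points x_1,\dots,x_d if there exist
% witness levels s_1,\dots,s_d such that every sign pattern is realized
% with margin \gamma:
\[
\forall\, \epsilon \in \{-1,+1\}^d \;\; \exists\, f \in F :\quad
\epsilon_i \bigl( f(x_i) - s_i \bigr) \ge \gamma
\quad \text{for all } i \le d .
\]
% The fat-shattering dimension is the largest cardinality so shattered:
\[
\mathrm{fat}_{\gamma}(F) \;=\;
\max \bigl\{ d : \exists\, x_1, \dots, x_d \text{ that } F \ \gamma\text{-shatters} \bigr\}.
\]
```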
📝 Abstract
We study gapped scale-sensitive dimensions of a function class in both sequential and non-sequential settings. We demonstrate that covering numbers for any uniformly bounded class are controlled above by these gapped dimensions, generalizing the results of \cite{anthony2000function,alon1997scale}. Moreover, we show that the gapped dimensions lead to lower bounds on offset Rademacher averages, thereby strengthening existing approaches for proving lower bounds on rates of convergence in statistical and online learning.
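To fix notation for the two quantities the abstract relates, the sketch below states the classical covering-number bound in terms of the fat-shattering dimension (the form established in \cite{alon1997scale}) and the usual definition of the offset Rademacher average. The constant $c$ and the symbol $\mathfrak{R}^{\mathrm{off}}_n$ are illustrative notation; the gapped refinements proved in the paper are not reproduced here.

```latex
% N(\gamma, F, n) denotes the size of a smallest \gamma-cover of F
% restricted to n points. The classical bound has the form
\[
\log N(\gamma, F, n) \;\le\;
O\!\Bigl( \mathrm{fat}_{c\gamma}(F) \, \log^2 \tfrac{n}{\gamma} \Bigr)
\quad \text{for an absolute constant } c > 0 ,
\]
% which the paper generalizes via the gapped dimensions.
% The offset Rademacher average (for i.i.d. signs \epsilon_i and an
% offset parameter c > 0) is the quantity bounded from below:
\[
\mathfrak{R}^{\mathrm{off}}_n(F) \;=\;
\mathbb{E}_{\epsilon} \, \sup_{f \in F} \,
\frac{1}{n} \sum_{i=1}^{n}
\Bigl( \epsilon_i \, f(x_i) - c \, f(x_i)^2 \Bigr).
\]
```

The negative quadratic term is what distinguishes the offset average from the ordinary Rademacher average, and it is what makes nontrivial lower bounds on this quantity delicate to obtain.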