Local Urysohn Width: A Topological Complexity Measure for Classification

📅 2026-03-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes local Urysohn width as an intrinsic measure of the topological-geometric complexity of classification problems, characterizing the minimal number of diameter-constrained local experts required to guarantee correct classification. Combining tools from algebraic topology (e.g., Betti numbers), metric geometry, and statistical learning theory, the paper establishes a strict hierarchy theorem and a topological-geometric scaling law, and shows that the measure is separated from VC dimension in both directions. The central contribution is identifying the fundamental constraint that Urysohn width imposes on classifier complexity and deriving a sample complexity lower bound of Ω(w log w) that is independent of VC dimension.
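As intuition for where a w log w sample requirement can come from, consider a coupon-collector analogue (an illustration of the scaling only, not the paper's actual lower-bound argument; the function name below is ours): if correct classification on a width-w problem forces the learner to observe at least one sample in each of w disjoint regions, uniform sampling needs about w ln w draws before every region has been hit.

```python
import math
import random

# Coupon-collector sketch (an analogy, not the paper's proof): count
# uniform draws over w regions until every region has been seen at
# least once; the expectation is w * H_w ~ w * ln(w).

def samples_to_hit_all(w: int, rng: random.Random) -> int:
    """Number of uniform draws until all w regions have been seen."""
    seen, draws = set(), 0
    while len(seen) < w:
        seen.add(rng.randrange(w))
        draws += 1
    return draws

if __name__ == "__main__":
    w, trials = 50, 400
    rng = random.Random(0)
    avg = sum(samples_to_hit_all(w, rng) for _ in range(trials)) / trials
    predicted = w * (math.log(w) + 0.5772)  # w * H_w via Euler-Mascheroni
    print(f"empirical {avg:.1f} draws vs predicted {predicted:.1f}")
```

Averaged over a few hundred trials, the empirical draw count tracks the w ln w prediction closely, which is the shape of the paper's Ω(w log w) bound.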

📝 Abstract
We introduce \emph{local Urysohn width}, a complexity measure for classification problems on metric spaces. Unlike VC dimension, fat-shattering dimension, and Rademacher complexity, which characterize the richness of hypothesis \emph{classes}, Urysohn width characterizes the topological-geometric complexity of the classification \emph{problem itself}: the minimum number of connected, diameter-bounded local experts needed to correctly classify all points within a margin-safe region. We prove four main results. First, a \textbf{strict hierarchy theorem}: for every integer $w \geq 1$, there exists a classification problem on a \emph{connected} compact metric space (a bouquet of circles with first Betti number $\beta_1 = w$) whose Urysohn width is exactly~$w$, establishing that topological complexity of the input space forces classifier complexity. Second, a \textbf{topology $\times$ geometry scaling law}: width scales as $\Omega(w \cdot L/D_0)$, where $w$ counts independent loops and $L/D_0$ is the ratio of loop circumference to locality scale. Third, a \textbf{two-way separation from VC dimension}: there exist problem families where width grows unboundedly while VC dimension is bounded by a constant, and conversely, families where VC dimension grows unboundedly while width remains~1. Fourth, a \textbf{sample complexity lower bound}: any learner that must correctly classify all points in the safe region of a width-$w$ problem needs $\Omega(w \log w)$ samples, independent of VC dimension.
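The scaling law can be made concrete with a minimal counting sketch (our illustration under stated assumptions, not the paper's construction; `arcs_needed` and `width_lower_bound` are hypothetical names): give each circle its intrinsic arc-length metric and assume the locality scale satisfies D_0 <= L/2, so a connected arc of diameter at most D_0 has length at most D_0. Covering one loop of circumference L then takes at least ceil(L/D_0) arcs, and a bouquet of w loops needs w times that, recovering the w * L / D_0 shape.

```python
import math

# Illustrative counting for the Omega(w * L / D0) scaling law.
# Assumptions (ours, for the sketch): intrinsic (arc-length) metric on
# each circle, and D0 <= L/2, so an arc of diameter <= D0 has length <= D0.

def arcs_needed(L: float, D0: float) -> int:
    """Minimum number of connected arcs of diameter <= D0 needed to
    cover a circle of circumference L: total covered length <= n * D0
    must reach L, so n >= ceil(L / D0)."""
    assert 0 < D0 <= L / 2
    return math.ceil(L / D0)

def width_lower_bound(w: int, L: float, D0: float) -> int:
    """Bouquet of w circles: each independent loop must be covered
    separately, so the counts add up across loops."""
    return w * arcs_needed(L, D0)

print(width_lower_bound(3, 10.0, 1.0))  # 3 loops x 10 arcs each -> 30
```

Shrinking the locality scale D0 or adding loops w both drive the count up linearly, which is exactly the topology-times-geometry product in the theorem.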
Problem

Research questions and friction points this paper is trying to address.

Urysohn width
topological complexity
classification
metric space
sample complexity
Innovation

Methods, ideas, or system contributions that make the work stand out.

local Urysohn width
topological complexity
classification
VC dimension
sample complexity