The Cost of Robustness: Tighter Bounds on Parameter Complexity for Robust Memorization in ReLU Nets

📅 2025-10-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper investigates the minimal parameter complexity required for ReLU networks to achieve robust memorization: given an ε-separated labeled dataset, the network must produce the correct label throughout an ℓ₂-ball of radius μ around each training sample. Method: The authors conduct a fine-grained analysis of the robustness ratio ρ = μ/ε ∈ (0,1), characterizing how parameter complexity scales with ρ. Using geometric constructions and activation-pattern analysis, they derive tighter upper and lower bounds on the number of parameters needed. Contribution/Results: The analysis reveals that for small ρ, robust memorization incurs no additional parameter cost over standard (non-robust) memorization; as ρ increases, the required parameter count grows. Crucially, the paper precisely quantifies the robustness–complexity trade-off, uncovering phase-transition behavior and critical-threshold phenomena in robust memorization, significantly improving upon prior complexity estimates.
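The robust memorization criterion itself is easy to probe empirically. Below is a minimal sketch, not from the paper: it computes the separation ε of a labeled dataset and spot-checks μ-ball robustness of a classifier by random sampling (a heuristic check, not a certificate). The names `model`, `X`, and `y` are hypothetical placeholders.

```python
import numpy as np

def separation(X, y):
    """Smallest l2 distance between differently labeled points (the epsilon of the paper)."""
    return min(np.linalg.norm(X[i] - X[j])
               for i in range(len(X)) for j in range(i + 1, len(X))
               if y[i] != y[j])

def robustly_memorizes(model, X, y, mu, n_samples=1000, rng=None):
    """Empirically check that `model` predicts y_i at every sampled point
    inside the l2-ball of radius mu around x_i (sampling, not a proof)."""
    rng = rng or np.random.default_rng(0)
    for x_i, y_i in zip(X, y):
        d = x_i.size
        # Uniform sampling in the l2-ball: uniform direction, radius ~ mu * U^(1/d).
        u = rng.normal(size=(n_samples, d))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        r = mu * rng.uniform(size=(n_samples, 1)) ** (1.0 / d)
        if not np.all(model(x_i + r * u) == y_i):
            return False
    return True

# Usage (hypothetical): the robustness ratio rho = mu / separation(X, y) must lie in (0, 1).
```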

📝 Abstract
We study the parameter complexity of robust memorization for $\mathrm{ReLU}$ networks: the number of parameters required to interpolate any given dataset with $\epsilon$-separation between differently labeled points, while ensuring predictions remain consistent within a $\mu$-ball around each training sample. We establish upper and lower bounds on the parameter count as a function of the robustness ratio $\rho = \mu / \epsilon$. Unlike prior work, we provide a fine-grained analysis across the entire range $\rho \in (0,1)$ and obtain tighter upper and lower bounds that improve upon existing results. Our findings reveal that the parameter complexity of robust memorization matches that of non-robust memorization when $\rho$ is small, but grows with increasing $\rho$.
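Stated formally, the setting in the abstract amounts to two conditions, one on the dataset and one on the network, plus the ratio that drives the bounds. This is a standard formalization consistent with the abstract, not necessarily the paper's exact notation:

```latex
% epsilon-separation of the dataset, mu-robust memorization by f,
% and the robustness ratio rho (as defined in the abstract):
\|x_i - x_j\|_2 \ge \epsilon \quad \text{whenever } y_i \ne y_j,
\qquad
f(x') = y_i \quad \text{for all } \|x' - x_i\|_2 \le \mu,
\qquad
\rho = \frac{\mu}{\epsilon} \in (0,1).
```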
Problem

Research questions and friction points this paper is trying to address.

Analyzing the parameter complexity of robust memorization in ReLU networks
Establishing tighter parameter bounds for ε-separated datasets
Investigating how the robustness ratio ρ = μ/ε affects parameter count
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fine-grained analysis across the full robustness ratio range ρ ∈ (0,1)
Tighter upper and lower bounds on parameter complexity for ReLU networks
Precise characterization of how parameter count grows with ρ