🤖 AI Summary
This work investigates the interpolation capability of deep ReLU networks on $N$ irregularly spaced points in the unit ball whose minimal separation distance is $\delta = \exp(-\Theta(N))$. The central question is parameter efficiency: the paper establishes the first tight lower bound of $\Omega(N)$ on the number of trainable parameters required for exact interpolation under such irregular sampling. Since $O(N)$ parameters are always sufficient, this bound is sharp. The analysis combines constructive piecewise-linear function design, VC-dimension-based combinatorial arguments, and Sobolev embedding theory. In particular, it shows that the bit-extraction techniques used to prove VC-dimension lower bounds cannot be applied in this irregular setting, and it derives an approximation error lower bound of $\Omega(n^{-s/d})$ for functions in the critical smoothness regime of the Sobolev space $W^{s,p}$. Together, the results characterize the parametric complexity of irregular interpolation and the fundamental limits of ReLU-network approximation in Sobolev spaces.
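As a worked restatement, the headline bound can be phrased as follows. This is a paraphrase of the summary above, not a verbatim theorem from the paper; the notation $W(f)$ for the parameter count of a network $f$, and $B^d$ for the unit ball in $\mathbb{R}^d$, is introduced here for illustration:

```latex
% Paraphrase of the headline lower bound (illustrative notation,
% not a verbatim theorem from the paper): W(f) counts trainable
% parameters and B^d is the unit ball in R^d.
\[
  \exists\, x_1,\dots,x_N \in B^d
  \ \text{with}\
  \min_{i \neq j} \|x_i - x_j\| \geq \delta = e^{-\Theta(N)}
  \ \text{and values } y_1,\dots,y_N
  \ \text{such that}
\]
\[
  \inf\bigl\{\, W(f) \;:\;
    f \text{ a deep ReLU network},\ f(x_i) = y_i \ \forall i
  \,\bigr\}
  \;=\; \Omega(N).
\]
```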
📝 Abstract
We study the interpolation power of deep ReLU neural networks. Specifically, we consider the question of how efficiently, in terms of the number of parameters, deep ReLU networks can interpolate values at $N$ datapoints in the unit ball which are separated by a distance $\delta$. We show that $\Omega(N)$ parameters are required in the regime where $\delta$ is exponentially small in $N$, which gives the sharp result in this regime since $O(N)$ parameters are always sufficient. This also shows that the bit-extraction technique used to prove lower bounds on the VC dimension cannot be applied to irregularly spaced datapoints. Finally, as an application we give a lower bound on the approximation rates that deep ReLU neural networks can achieve for Sobolev spaces at the embedding endpoint.
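To make the sufficiency direction concrete, here is a minimal NumPy sketch of the standard one-dimensional construction: a single-hidden-layer ReLU network with $N-1$ neurons, hence $O(N)$ parameters, that interpolates any $N$ points exactly. The function names and the 1-D setup are illustrative, not from the paper, which works with points in the $d$-dimensional unit ball:

```python
# Illustrative 1-D sketch of the O(N)-parameter upper bound (not the
# paper's construction): f(x) = y0 + sum_i w_i * relu(x - b_i) is a
# one-hidden-layer ReLU network that hits all N data points exactly.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def fit_relu_interpolant(xs, ys):
    # Sort the data so consecutive interval slopes are well defined.
    order = np.argsort(xs)
    xs, ys = xs[order], ys[order]
    slopes = np.diff(ys) / np.diff(xs)      # slope on each [x_i, x_{i+1}]
    weights = np.diff(slopes, prepend=0.0)  # w_i = s_i - s_{i-1} (slope change)
    return xs[:-1], weights, ys[0]          # breakpoints, outer weights, offset

def evaluate(x, breakpoints, weights, y0):
    # One hidden layer with N-1 ReLU neurons, one breakpoint per neuron.
    return y0 + relu(x[:, None] - breakpoints[None, :]) @ weights

rng = np.random.default_rng(0)
xs = np.sort(rng.uniform(-1.0, 1.0, size=8))
ys = rng.normal(size=8)
b, w, y0 = fit_relu_interpolant(xs, ys)
assert np.allclose(evaluate(xs, b, w, y0), ys)  # exact interpolation
```

The lower bound above complements this construction: once the datapoints are exponentially close together, no ReLU network, however deep, can asymptotically beat this linear parameter count.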