🤖 AI Summary
This paper investigates the consistency of minimum-norm interpolation in reproducing kernel Hilbert spaces (RKHS) with bounded kernels, measuring the generalization error under a continuous scale of Sobolev-type norms that interpolates between the $L^2$ norm and the RKHS norm. Using spectral analysis, Sobolev embedding theorems, and a precise characterization of the kernel's eigenvalue decay rate, the authors derive the first tight lower bound on the generalization error across this norm scale. They prove that whenever the smoothness index of the norm exceeds a threshold determined solely by the embedding index of the RKHS and the eigenvalue decay rate, the interpolation solution is necessarily inconsistent. This reveals an intrinsic inconsistency of kernel interpolation along the entire continuous norm spectrum and gives a rigorous criterion for understanding the generalization limits of overparameterized interpolating models.
📝 Abstract
We study the consistency of minimum-norm interpolation in reproducing kernel Hilbert spaces corresponding to bounded kernels. Our main result gives lower bounds for the generalization error of kernel interpolation measured in a continuous scale of norms that interpolate between $L^2$ and the hypothesis space. These lower bounds imply that kernel interpolation is always inconsistent when the smoothness index of the norm exceeds a constant that depends only on the embedding index of the hypothesis space and the decay rate of the eigenvalues.