🤖 AI Summary
Conventional uniform bounds on the non-asymptotic relative deviation of eigenvalues of empirical covariance and Gram matrices fail for small eigenvalues.
Method: We develop a general theorem that automatically upgrades existing uniform bounds to sharp, spectrum-wide relative deviation bounds—including vanishing eigenvalues—via a constructive proof framework integrating matrix perturbation theory, random matrix theory, and concentration inequalities.
Contribution/Results: The resulting bounds are distribution-free, apply to arbitrary data distributions, and significantly improve upon classical results in high-dimensional, low-sample regimes. Crucially, they provide the first tight, relative-error control over the entire eigenvalue spectrum—including eigenvalues approaching zero—thereby enabling precise spectral analysis even in ill-conditioned or rank-deficient settings. The framework is concise, broadly applicable, and readily transferable to related problems in statistical learning and high-dimensional inference.
📝 Abstract
We provide non-asymptotic, relative deviation bounds for the eigenvalues of empirical covariance and Gram matrices in general settings. Unlike typical uniform bounds, which may fail to capture the behavior of smaller eigenvalues, our results provide sharper control across the spectrum. Our analysis is based on a general-purpose theorem that allows one to convert existing uniform bounds into relative ones. The theorems and techniques emphasize simplicity and should be applicable across various settings.