Regularized least squares learning with heavy-tailed noise is minimax optimal

πŸ“… 2025-05-20
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This paper investigates the learning performance of regularized least squares (i.e., kernel ridge regression) in reproducing kernel Hilbert spaces under heavy-tailed noiseβ€”where only finite higher-order moments exist, violating the sub-exponential assumption. Methodologically, it introduces an integral operator framework combined with a Hilbert-space extension of the Fuk-Nagaev inequality. The analysis yields an excess risk bound comprising a sub-Gaussian leading term and a polynomial remainder term. Crucially, this is the first result establishing sub-Gaussian convergence rates under heavy-tailed noise, rigorously proving the asymptotic robustness of kernel ridge regression. Under standard eigenvalue decay conditions, the method achieves minimax-optimal convergence rates. These advances break the two-decade reliance on sub-exponential noise assumptions in kernel learning theory, substantially broadening both the applicability and theoretical foundation of kernel ridge regression.
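To make the object of study concrete, here is a minimal sketch of the kernel ridge regression estimator the analysis concerns, fitted on data whose noise is heavy-tailed (Student-t with 3 degrees of freedom, so only finitely many moments exist). This is an illustrative toy, not the paper's method; the Gaussian kernel, bandwidth, and regularization value are arbitrary choices for the sketch.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=0.5):
    # Gram matrix of the Gaussian (RBF) kernel between rows of X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * bandwidth**2))

def krr_fit(X, y, lam, bandwidth=0.5):
    # Regularized least squares in the RKHS:
    # minimize (1/n) * sum (f(x_i) - y_i)^2 + lam * ||f||_H^2,
    # whose solution solves (K + n*lam*I) alpha = y by the representer theorem
    n = len(X)
    K = gaussian_kernel(X, X, bandwidth)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_predict(X_train, alpha, X_test, bandwidth=0.5):
    return gaussian_kernel(X_test, X_train, bandwidth) @ alpha

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1, 1, (n, 1))
f_true = np.sin(np.pi * X[:, 0])  # target regression function
# Student-t noise, df=3: finite variance but heavy (polynomial) tails,
# violating the sub-exponential assumption discussed above
y = f_true + 0.3 * rng.standard_t(df=3, size=n)

alpha = krr_fit(X, y, lam=1e-2)
X_test = np.linspace(-1, 1, 100)[:, None]
pred = krr_predict(X, alpha, X_test)
mse = np.mean((pred - np.sin(np.pi * X_test[:, 0]))**2)
```

Despite the heavy-tailed noise, the estimator recovers the regression function well on this toy problem, consistent with the robustness result the paper proves.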

πŸ“ Abstract
This paper examines the performance of ridge regression in reproducing kernel Hilbert spaces in the presence of noise that exhibits only a finite number of higher moments. We establish excess risk bounds consisting of subgaussian and polynomial terms based on the well-known integral operator framework. The dominant subgaussian component allows us to achieve convergence rates that have previously only been derived under subexponential noise, a prevalent assumption in related work from the last two decades. These rates are optimal under standard eigenvalue decay conditions, demonstrating the asymptotic robustness of regularized least squares against heavy-tailed noise. Our derivations are based on a Fuk-Nagaev inequality for Hilbert-space-valued random variables.
Problem

Research questions and friction points this paper is trying to address.

Analyzes kernel ridge regression in RKHS under heavy-tailed noise
Establishes minimax-optimal excess risk bounds assuming only finitely many moments
Demonstrates the asymptotic robustness of regularized least squares to heavy-tailed noise
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analysis of ridge regression in RKHS under heavy-tailed noise
Excess risk bounds with a subgaussian leading term and a polynomial remainder
Fuk-Nagaev inequality for Hilbert-space-valued random variables
πŸ”Ž Similar Papers
No similar papers found.