AI Summary
This work addresses nonparametric regression over Besov spaces under sub-exponential noise by proposing an online learning algorithm based on wavelet expansions. The method employs adaptive gradient clipping, which removes the need to pre-specify bounds on the noise variance or the gradients. By combining a refined online-to-batch conversion with self-normalized concentration inequalities, the approach achieves, for the first time in the batch statistical setting, minimax-optimal estimation guarantees that hold with high probability. The resulting estimator adapts automatically to both the unknown noise level and the $\ell_1$-norm of the comparator, attaining an adaptive and minimax-optimal rate for the integrated squared error.
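To make the mechanism concrete, here is a minimal sketch of online gradient descent on a truncated wavelet expansion with a norm-based clipping threshold that adapts to the gradients actually observed. The function names, the specific threshold-doubling rule, and the step-size schedule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def clip_to_norm(g, tau):
    """Scale g down so that its Euclidean norm is at most tau."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else (tau / norm) * g

def online_wavelet_regression(features, ys, eta=0.1):
    """Online gradient descent on wavelet coefficients with adaptive clipping.

    features[t] holds the (truncated) wavelet basis evaluated at the t-th
    input point; ys[t] is the corresponding noisy observation. The clipping
    threshold tau is raised whenever a clipped gradient approaches it, so no
    a-priori bound on the noise variance or gradients is required.
    """
    T, d = features.shape
    w = np.zeros(d)                 # current wavelet-coefficient estimate
    tau = 1.0                       # initial clipping threshold (assumption)
    avg = np.zeros(d)               # running average of the iterates
    for t in range(T):
        x, y = features[t], ys[t]
        g = 2.0 * (w @ x - y) * x   # gradient of the squared loss at (x, y)
        g = clip_to_norm(g, tau)
        w = w - (eta / np.sqrt(t + 1)) * g
        tau = max(tau, 2.0 * np.linalg.norm(g))  # illustrative doubling rule
        avg += (w - avg) / (t + 1)
    return avg                      # online-to-batch: average of the iterates
```

Returning the average of the iterates is the simplest form of online-to-batch conversion; the paper's refined version additionally exploits the structure of the squared loss together with self-normalized concentration inequalities to obtain its high-probability guarantees.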
Abstract
We study nonparametric regression over Besov spaces from noisy observations under sub-exponential noise, aiming to achieve minimax-optimal guarantees on the integrated squared error that hold with high probability and adapt to the unknown noise level. To this end, we propose a wavelet-based online learning algorithm that dynamically adjusts to the observed gradient noise by adaptively clipping it at an appropriate level, eliminating the need to tune parameters such as the noise variance or gradient bounds. As a by-product of our analysis, we derive high-probability adaptive regret bounds that scale with the $\ell_1$-norm of the competitor. Finally, in the batch statistical setting, we obtain adaptive and minimax-optimal estimation rates for Besov spaces via a refined online-to-batch conversion. This approach carefully exploits the structure of the squared loss in combination with self-normalized concentration inequalities.
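For reference, the benchmark the abstract alludes to is the classical minimax rate over Besov balls (a standard fact not stated in the abstract itself): under the usual regularity conditions (e.g., smoothness $s > 1/p$), the best achievable integrated squared error over a Besov ball $B^s_{p,q}(L)$ from $n$ noisy observations is of order

$$\inf_{\hat f}\;\sup_{f \in B^s_{p,q}(L)} \mathbb{E}\,\big\|\hat f - f\big\|_{L_2}^2 \;\asymp\; n^{-\frac{2s}{2s+1}},$$

possibly up to logarithmic factors depending on the parameter regime. Achieving this rate adaptively, with high probability and without knowledge of the noise level, is the goal the paper's estimator attains.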