Optimal Recovery Meets Minimax Estimation

📅 2025-02-24
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
This paper addresses the optimal approximation of functions in Besov classes under Gaussian noise, measured in the $L_q$-norm. Existing minimax rates fail to precisely capture the dependence on the noise level $\sigma$, thereby precluding recovery of the classical noiseless optimal rate as $\sigma \to 0$. To bridge this theoretical gap, we establish the first noise-level-aware (NLA) sharp minimax convergence rate, providing matching upper and lower bounds and yielding an exact asymptotic characterization of the estimation error in terms of both the sample size $m$ and the noise variance $\sigma^2$. Methodologically, our analysis integrates Besov space theory, information-theoretic lower-bound techniques, adaptive truncation, and linear estimation. The resulting rate unifies the optimal recovery and minimax estimation frameworks, ensuring a continuous transition to the classical noiseless optimal rate as $\sigma \to 0$, and provides a unified error benchmark bridging statistical learning and numerical analysis.
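In symbols, the setting can be sketched as follows; this is the standard nonparametric-regression formulation implied by the summary, with our own notation for the sample points $x_i$, the noise $\varepsilon_i$, and the benchmark $E(m,\sigma)$ (the paper's conventions may differ):
$$ y_i = f(x_i) + \varepsilon_i, \qquad \varepsilon_i \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,\sigma^2), \qquad i = 1,\dots,m, $$
and the quantity being characterized is the noise-level-aware minimax error
$$ E(m,\sigma) := \inf_{\hat f}\ \sup_{f \in U} \mathbb{E}\,\| f - \hat f \|_{L_q}, $$
where $U$ is the unit ball of the Besov class and the infimum runs over all estimators $\hat f$ constructed from the $m$ observations. The paper's result pins down $E(m,\sigma)$ with matching upper and lower bounds simultaneously in $m$ and $\sigma$.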

📝 Abstract
A fundamental problem in statistics and machine learning is to estimate a function $f$ from possibly noisy observations of its point samples. The goal is to design a numerical algorithm to construct an approximation $\hat f$ to $f$ in a prescribed norm that asymptotically achieves the best possible error (as a function of the number $m$ of observations and the variance $\sigma^2$ of the noise). This problem has received considerable attention in both nonparametric statistics (noisy observations) and optimal recovery (noiseless observations). Quantitative bounds require assumptions on $f$, known as model class assumptions. Classical results assume that $f$ is in the unit ball of a Besov space. In nonparametric statistics, the best possible performance of an algorithm for finding $\hat f$ is known as the minimax rate and has been studied in this setting under the assumption that the noise is Gaussian. In optimal recovery, the best possible performance of an algorithm is known as the optimal recovery rate and has also been determined in this setting. While one would expect that the minimax rate recovers the optimal recovery rate when the noise level $\sigma$ tends to zero, it turns out that the current results on minimax rates do not carefully determine the dependence on $\sigma$, and the limit cannot be taken. This paper handles this issue and determines the noise-level-aware (NLA) minimax rates for Besov classes when error is measured in an $L_q$-norm, with matching upper and lower bounds. The end result is a reconciliation between minimax rates and optimal recovery rates: the NLA minimax rate depends continuously on the noise level and recovers the optimal recovery rate as $\sigma$ tends to zero.
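For orientation, the two classical benchmarks being reconciled have well-known forms; the exponents below are the textbook rates for smoothness $s$ in $d$ dimensions in the simplest regime, stated here only as a hedged sketch (the paper's NLA rate may involve further parameter regimes or logarithmic factors):
$$ \underbrace{m^{-s/d}}_{\text{optimal recovery } (\sigma = 0)} \qquad \text{vs.} \qquad \underbrace{\big(\sigma^2/m\big)^{\frac{s}{2s+d}}}_{\text{classical minimax with } \sigma \text{ tracked naively}}. $$
Sending $\sigma \to 0$ at fixed $m$ drives the right-hand rate to $0$ rather than to $m^{-s/d}$, which is exactly the discontinuity described above; an NLA rate must instead satisfy $E(m,\sigma) \to E(m,0) \asymp m^{-s/d}$ as $\sigma \to 0$, matching the noiseless benchmark.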
Problem

Research questions and friction points this paper is trying to address.

Estimate function from noisy samples with optimal error
Reconcile minimax and optimal recovery rates for Besov classes
Determine noise-level-aware minimax rates in Lq-norm
Innovation

Methods, ideas, or system contributions that make the work stand out.

Determines noise-level-aware minimax rates
Reconciles minimax and optimal recovery rates
Uses Besov classes for function estimation
Ronald A. DeVore
Department of Mathematics, Texas A&M University
Robert D. Nowak
Department of Electrical and Computer Engineering, University of Wisconsin–Madison
Rahul Parhi
University of California, San Diego
applied mathematics, signal processing, machine learning, statistics, optimization
G. Petrova
Department of Mathematics, Texas A&M University
Jonathan W. Siegel
Assistant Professor, Texas A&M University
Approximation Theory, Statistics, Machine Learning