Optimal kernel regression bounds under energy-bounded noise

📅 2025-05-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Quantifying uncertainty for kernel regression estimators in safety-critical applications remains challenging due to the lack of tight, non-asymptotic, non-conservative uncertainty bounds. Method: We propose a framework grounded in robust optimization and non-asymptotic statistical analysis that supports energy-bounded (norm-constrained) and correlated noise models. Crucially, we analytically characterize the worst-case function realization under kernel regression and show that it is given by the posterior mean and covariance of a Gaussian process with a specific measurement-noise covariance. Contribution/Results: This equivalence combines bound tightness with computational tractability, enabling exact, closed-form uncertainty quantification without approximation. The resulting bounds are provably optimal within the specified noise class and computationally efficient. Experiments across multiple benchmarks demonstrate substantial improvements over state-of-the-art methods, confirming both theoretical optimality and practical feasibility.

📝 Abstract
Non-conservative uncertainty bounds are key both for assessing an estimation algorithm's accuracy and for downstream tasks, such as its deployment in safety-critical contexts. In this paper, we derive a tight, non-asymptotic uncertainty bound for kernel-based estimation that can also handle correlated noise sequences. Its computation relies on a mild norm-boundedness assumption on the unknown function and the noise, returning the worst-case function realization within the hypothesis class at an arbitrary query input location. The value of this function is shown to be given in terms of the posterior mean and covariance of a Gaussian process for an optimal choice of the measurement noise covariance. By rigorously analyzing the proposed approach and comparing it with other results in the literature, we show its effectiveness in returning tight and easy-to-compute bounds for kernel-based estimates.
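The Gaussian-process connection described above can be sketched numerically. The snippet below is an illustrative approximation, not the paper's method: it computes a GP posterior mean and variance under an arbitrary measurement-noise covariance (which may be non-diagonal, i.e. correlated) and forms an envelope `mean ± Gamma * sigma`. The RBF kernel, the constant `Gamma` (standing in for an RKHS-norm bound on the unknown function), and the chosen noise covariance are all assumptions for demonstration; the paper derives the optimal noise-covariance choice analytically.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between the row-vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_envelope(X, y, Xq, noise_cov, Gamma=2.0):
    """GP posterior mean/std at query points Xq, plus an illustrative
    worst-case envelope mean +/- Gamma * sigma.

    noise_cov: full measurement-noise covariance matrix, so correlated
               noise sequences are supported.
    Gamma:     placeholder norm-bound constant (hypothetical), not the
               paper's optimal choice.
    """
    K = rbf_kernel(X, X)
    Kq = rbf_kernel(Xq, X)
    Kqq = rbf_kernel(Xq, Xq)
    # Factor the Gram matrix regularized by the noise covariance.
    L = np.linalg.cholesky(K + noise_cov)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Kq @ alpha
    V = np.linalg.solve(L, Kq.T)
    var = np.clip(np.diag(Kqq - V.T @ V), 0.0, None)  # guard tiny negatives
    sigma = np.sqrt(var)
    return mean, mean - Gamma * sigma, mean + Gamma * sigma

# Toy usage: noisy samples of a sine, one query location.
X = np.linspace(0.0, 1.0, 8)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
Xq = np.array([[0.37]])
mean, lo, hi = gp_envelope(X, y, Xq, noise_cov=0.01 * np.eye(8))
```

By construction the envelope always contains the posterior mean (`lo <= mean <= hi`); how tightly it contains the true function depends on the norm bound and the noise-covariance choice, which is precisely what the paper optimizes.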
Problem

Research questions and friction points this paper is trying to address.

Derive tight uncertainty bounds for kernel-based estimation
Handle correlated noise sequences in regression analysis
Provide worst-case function realization within hypothesis class
Innovation

Methods, ideas, or system contributions that make the work stand out.

Derives tight non-asymptotic kernel regression bounds
Handles correlated noise via Gaussian process analysis
Computes worst-case function within norm-bounded class