On the Effect of Regularization on Nonparametric Mean-Variance Regression

📅 2025-11-26
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In overparameterized mean-variance regression, a fundamental ambiguity impedes decoupling of signal (mean) from noise (variance), and varying regularization strength induces a sharp phase transition—from zero-residual overfitting to constant predictions capturing only noise. We develop a unified analytical framework grounded in statistical field theory, providing the first exact characterization of this phase transition as a regularization-driven critical phenomenon. Building on this insight, we propose a one-dimensional regularization-path search strategy, replacing conventional two-dimensional hyperparameter tuning and substantially reducing computational cost. Our approach integrates nonparametric modeling, empirical regularization analysis, and rigorous theoretical derivation. Extensive evaluation across multiscale benchmarks—including UCI and ClimSim datasets—demonstrates a 23–37% reduction in calibration error and significant improvements in uncertainty quantification accuracy and robustness.

Technology Category

Application Category

📝 Abstract
Uncertainty quantification is vital for decision-making and risk assessment in machine learning. Mean-variance regression models, which predict both a mean and residual noise for each data point, provide a simple approach to uncertainty quantification. However, overparameterized mean-variance models struggle with signal-to-noise ambiguity, deciding whether prediction targets should be attributed to signal (mean) or noise (variance). At one extreme, models fit all training targets perfectly with zero residual noise, while at the other, they provide constant, uninformative predictions and explain the targets as noise. We observe a sharp phase transition between these extremes, driven by model regularization. Empirical studies with varying regularization levels illustrate this transition, revealing substantial variability across repeated runs. To explain this behavior, we develop a statistical field theory framework, which captures the observed phase transition in alignment with experimental results. This analysis reduces the regularization hyperparameter search space from two dimensions to one, significantly lowering computational costs. Experiments on UCI datasets and the large-scale ClimSim dataset demonstrate robust calibration performance, effectively quantifying predictive uncertainty.
Problem

Research questions and friction points this paper is trying to address.

Addresses signal-noise ambiguity in overparameterized mean-variance regression models.
Explores regularization-induced phase transition affecting uncertainty quantification calibration.
Reduces hyperparameter search dimensionality to lower computational costs.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Statistical field theory framework explains phase transition
Reduces regularization hyperparameter search from two to one dimension
Demonstrates robust calibration on UCI and ClimSim datasets
🔎 Similar Papers
No similar papers found.