Bayes and Biased Estimators Without Hyper-parameter Estimation: Comparable Performance to the Empirical-Bayes-Based Regularized Estimator

📅 2025-03-14
📈 Citations: 1
Influential: 0
🤖 AI Summary
This paper addresses the computational burden of hyper-parameter optimization in regularized estimation for system identification. The authors propose two parameter-free estimators: a generalized Bayes estimator and a closed-form biased estimator. Theoretically, they construct, for the first time, a family of hyper-parameter-free estimators whose excess mean squared error (MSE) matches that of the optimal empirical-Bayes-based regularized estimator. Methodologically, by combining large-sample asymptotics, ridge-regression modeling, and bias–variance trade-off analysis, they derive analytical closed-form solutions that bypass iterative optimization entirely. Numerical experiments show that the proposed estimators achieve MSE performance comparable to the best empirical Bayes method while significantly reducing computation time, making them well suited to real-time applications and resource-constrained environments.

📝 Abstract
Regularized system identification has become a significant complement to more classical system identification. It has been numerically shown that kernel-based regularized estimators often perform better than the maximum likelihood estimator in terms of minimizing mean squared error (MSE). However, regularized estimators often require hyper-parameter estimation. This paper focuses on ridge regression and the regularized estimator by employing the empirical Bayes hyper-parameter estimator. We utilize the excess MSE to quantify the MSE difference between the empirical-Bayes-based regularized estimator and the maximum likelihood estimator for large sample sizes. We then exploit the excess MSE expressions to develop both a family of generalized Bayes estimators and a family of closed-form biased estimators. They have the same excess MSE as the empirical-Bayes-based regularized estimator but eliminate the need for hyper-parameter estimation. Moreover, we conduct numerical simulations to show that the performance of these new estimators is comparable to the empirical-Bayes-based regularized estimator, while computationally, they are more efficient.
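The empirical-Bayes baseline that the abstract describes can be sketched in a few lines: ridge regression whose regularization weight is chosen by maximizing the marginal likelihood of the data. The sketch below is a minimal illustration of that baseline, not the paper's proposed hyper-parameter-free estimators; the simulated regression data, the scalar prior covariance `c`, and the grid search over `c` are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear regression y = Phi @ theta + e, e ~ N(0, sigma2 * I)
n, d = 200, 10
theta_true = rng.normal(0.0, 1.0, d)
Phi = rng.normal(0.0, 1.0, (n, d))
sigma2 = 1.0
y = Phi @ theta_true + rng.normal(0.0, np.sqrt(sigma2), n)

# Maximum likelihood estimator: ordinary least squares
theta_ml = np.linalg.lstsq(Phi, y, rcond=None)[0]

# Empirical Bayes: prior theta ~ N(0, c * I), so marginally
# y ~ N(0, c * Phi @ Phi.T + sigma2 * I); pick c by maximizing
# the log marginal likelihood over a grid (an assumption here;
# practical implementations use iterative optimization).
def log_marginal(c):
    S = c * (Phi @ Phi.T) + sigma2 * np.eye(n)
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (logdet + y @ np.linalg.solve(S, y))

grid = np.logspace(-3, 2, 60)
c_hat = grid[np.argmax([log_marginal(c) for c in grid])]

# Regularized (ridge) estimate with the empirical-Bayes hyper-parameter
lam = sigma2 / c_hat
theta_ridge = np.linalg.solve(Phi.T @ Phi + lam * np.eye(d), Phi.T @ y)
```

Because `lam > 0`, the ridge estimate shrinks the least-squares solution toward zero; the paper's contribution is to match the large-sample excess MSE of this estimator while skipping the hyper-parameter search entirely.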
Problem

Research questions and friction points this paper is trying to address.

Kernel-based regularized estimators often beat maximum likelihood in MSE, but require costly hyper-parameter estimation.
Can estimators match the excess MSE of the empirical-Bayes-based regularized estimator without estimating hyper-parameters?
Focus: ridge regression and MSE minimization for large sample sizes.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Constructs a family of generalized Bayes estimators that need no hyper-parameter estimation.
Derives a family of closed-form biased estimators that bypass iterative optimization.
Matches the excess MSE of the empirical-Bayes-based regularized estimator at lower computational cost.
Yue Ju
Division of Decision and Control Systems, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 10044 Stockholm, Sweden
Bo Wahlberg
Professor of Automatic Control, KTH Royal Institute of Technology
control systems, system identification, signal processing
Håkan Hjalmarsson
Division of Decision and Control Systems, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, 10044 Stockholm, Sweden