Autotune: fast, accurate, and automatic tuning parameter selection for LASSO

πŸ“… 2025-12-11
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the difficulty of selecting the LASSO tuning parameter efficiently and accurately in high-dimensional regression and time-series models (e.g., the VAR), this paper proposes autotune, a strategy by which the Lasso tunes itself: it alternately optimizes a penalized Gaussian log-likelihood over the regression coefficients and the noise standard deviation. Key contributions include: (i) a new estimator of the noise standard deviation that can be used for high-dimensional inference; (ii) a visual diagnostic procedure for checking the sparsity assumption on the regression coefficients; and (iii) fully automatic, cross-validation-free tuning of the Lasso. Implemented with a C++ core and packaged as an R library, autotune is shown in extensive simulations on regression and VAR models to be faster and to provide better generalization and model selection than established alternatives in low signal-to-noise regimes; its practical utility is further demonstrated on a real-world financial data set.
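The alternating scheme described above can be sketched in a few lines. This is a minimal illustration under assumptions, not the paper's implementation: the scaled penalized objective, the penalty level `lam0 = sqrt(2 log p / n)`, and the plain coordinate-descent lasso solver are all choices made here for the sake of a runnable example.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used by coordinate-descent lasso."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, beta, n_iter=200):
    """Coordinate descent for (1/(2n))||y - X beta||^2 + lam ||beta||_1."""
    n, p = X.shape
    col_sq = (X ** 2).sum(axis=0) / n   # per-column curvature
    r = y - X @ beta                    # current residual (warm start)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]                    # remove coordinate j
            rho = X[:, j] @ r / n                     # partial correlation
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * beta[j]                    # restore residual
    return beta

def autotune_sketch(X, y, lam0=None, n_outer=20, tol=1e-6):
    """Alternate between a lasso step for beta (penalty scaled by the
    current sigma estimate) and a closed-form update of sigma from
    the residuals, until sigma stabilizes."""
    n, p = X.shape
    if lam0 is None:
        lam0 = np.sqrt(2 * np.log(p) / n)  # universal penalty level (assumption)
    beta = np.zeros(p)
    sigma = np.std(y)                      # crude initial noise estimate
    for _ in range(n_outer):
        beta = lasso_cd(X, y, lam0 * sigma, beta)
        sigma_new = np.linalg.norm(y - X @ beta) / np.sqrt(n)
        if abs(sigma_new - sigma) < tol:
            sigma = sigma_new
            break
        sigma = sigma_new
    return beta, sigma
```

On sparse simulated data, the returned `sigma` is a usable noise-level estimate and the lasso step recovers the support, with no cross-validation loop: the penalty adapts to the noise level rather than being searched over a grid.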

πŸ“ Abstract
Least absolute shrinkage and selection operator (Lasso), a popular method for high-dimensional regression, is now used widely for estimating high-dimensional time series models such as the vector autoregression (VAR). Selecting its tuning parameter efficiently and accurately remains a challenge, despite the abundance of available methods for doing so. We propose $\mathsf{autotune}$, a strategy for Lasso to automatically tune itself by optimizing a penalized Gaussian log-likelihood alternately over regression coefficients and noise standard deviation. Using extensive simulation experiments on regression and VAR models, we show that $\mathsf{autotune}$ is faster, and provides better generalization and model selection than established alternatives in low signal-to-noise regimes. In the process, $\mathsf{autotune}$ provides a new estimator of noise standard deviation that can be used for high-dimensional inference, and a new visual diagnostic procedure for checking the sparsity assumption on regression coefficients. Finally, we demonstrate the utility of $\mathsf{autotune}$ on a real-world financial data set. An R package based on C++ is also made publicly available on GitHub.
Problem

Research questions and friction points this paper is trying to address.

How to select the Lasso tuning parameter automatically, efficiently, and accurately, despite the abundance of existing methods
How to improve generalization and model selection in low signal-to-noise regimes
How to estimate the noise standard deviation for high-dimensional inference and check the sparsity assumption on coefficients
Innovation

Methods, ideas, or system contributions that make the work stand out.

Autotune tunes the Lasso automatically by alternately optimizing a penalized Gaussian log-likelihood over coefficients and noise standard deviation.
It provides a new noise standard deviation estimator for high-dimensional inference.
It includes a visual diagnostic to check sparsity assumptions on coefficients.
πŸ”Ž Similar Papers
No similar papers found.