Optimal Cross-Validation for Sparse Linear Regression

📅 2023-06-26
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
To address the high computational cost of k-fold cross-validation for high-dimensional sparse linear regression, this paper derives computationally tractable relaxations of the cross-validation metrics used for hyperparameter tuning. Because each hyperparameter combination ordinarily requires solving multiple mixed-integer optimization problems (MIOs), tuning dominates the overall cost; the relaxations allow hyperparameters to be selected after solving 50-80% fewer MIOs in practice. The relaxed objective is optimized with an efficient cyclic coordinate descent scheme. Across a suite of 13 real-world datasets, the approach attains 10-30% lower validation errors than traditional methods such as grid search with MCP or GLMNet, while preserving the sparsity and interpretability of the fitted regressors.
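For context, ridge-regularized sparse linear regression is typically posed as the cardinality-constrained problem below; the hyperparameters k (sparsity) and γ (ridge weight) are exactly what cross-validation must tune. This is a standard formulation with assumed notation and scaling, not reproduced from the paper.

```latex
% Standard ridge-regularized best-subset formulation (notation and
% scaling are assumptions, not taken verbatim from the paper).
\min_{\beta \in \mathbb{R}^p} \;
  \frac{1}{2} \lVert y - X\beta \rVert_2^2
  + \frac{1}{2\gamma} \lVert \beta \rVert_2^2
\quad \text{s.t.} \quad \lVert \beta \rVert_0 \le k
```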
📝 Abstract
Given a high-dimensional covariate matrix and a response vector, ridge-regularized sparse linear regression selects a subset of features that explains the relationship between covariates and the response in an interpretable manner. To tune the sparsity and robustness of linear regressors, techniques like k-fold cross-validation are commonly used for hyperparameter selection. However, cross-validation substantially increases the computational cost of sparse regression as it requires solving many mixed-integer optimization problems (MIOs) for each hyperparameter combination. To improve upon this state of affairs, we obtain computationally tractable relaxations of k-fold cross-validation metrics, facilitating hyperparameter selection after solving 50-80% fewer MIOs in practice. These relaxations result in an efficient cyclic coordinate descent scheme, achieving 10-30% lower validation errors than traditional methods such as grid search with MCP or GLMNet across a suite of 13 real-world datasets.
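To make the bottleneck concrete, here is a minimal sketch of naive grid-search tuning in which every (sparsity, ridge) pair triggers one MIO solve per fold. `solve_sparse_mio` is a hypothetical placeholder for a mixed-integer solver, not the authors' implementation; only the cost accounting matters.

```python
# Naive k-fold grid search for MIO-based sparse regression: the number
# of MIO solves grows as |sparsity_grid| * |gamma_grid| * n_folds.
import itertools
import numpy as np
from sklearn.model_selection import KFold

def solve_sparse_mio(X, y, k_sparsity, gamma):
    """Hypothetical placeholder: would return the coefficients of
    min 0.5*||y - Xb||^2 + (1/(2*gamma))*||b||^2 s.t. ||b||_0 <= k_sparsity,
    computed by a mixed-integer solver."""
    raise NotImplementedError("stand-in for an MIO solver call")

def grid_search_cv(X, y, sparsity_grid, gamma_grid, n_folds=5):
    kf = KFold(n_splits=n_folds, shuffle=True, random_state=0)
    best, best_err = None, np.inf
    for k_sp, gamma in itertools.product(sparsity_grid, gamma_grid):
        fold_errs = []
        for train_idx, val_idx in kf.split(X):
            # One expensive MIO solve per fold, per grid point.
            beta = solve_sparse_mio(X[train_idx], y[train_idx], k_sp, gamma)
            resid = y[val_idx] - X[val_idx] @ beta
            fold_errs.append(np.mean(resid ** 2))
        if np.mean(fold_errs) < best_err:
            best, best_err = (k_sp, gamma), np.mean(fold_errs)
    return best, best_err
```

A 10 x 10 grid with 5 folds already requires 500 MIO solves; the paper's relaxations are aimed at avoiding most of them.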
Problem

Research questions and friction points this paper is trying to address.

Improving computational efficiency in sparse linear regression
Reducing the number of MIO solves needed for hyperparameter selection via tractable relaxations
Enhancing validation accuracy compared to traditional methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Computationally tractable relaxations of k-fold cross-validation metrics
50-80% fewer MIO solves during hyperparameter selection
Cyclic coordinate descent achieving 10-30% lower validation errors (see the sketch after this list)
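The paper's relaxed objective is not reproduced here, but the sketch below illustrates the overall pattern: replace the expensive per-fold MIO solves with a cheap surrogate validation loss and tune the two hyperparameters by cyclic coordinate descent. The surrogate `relaxed_cv_loss` (a ridge fit on a correlation-screened support) and every name in it are illustrative assumptions, not the paper's method.

```python
# Cyclic coordinate descent over (sparsity k, ridge weight gamma),
# minimizing a cheap surrogate of the k-fold validation error.
import numpy as np
from sklearn.model_selection import KFold

def relaxed_cv_loss(k_sp, gamma, X, y, n_folds=5):
    """Toy surrogate: k-fold error of ridge regression restricted to the
    k_sp features most correlated with y. Illustrative only; the paper
    uses its own differentiable relaxation."""
    support = np.argsort(-np.abs(X.T @ y))[:k_sp]
    Xs = X[:, support]
    kf = KFold(n_splits=n_folds, shuffle=True, random_state=0)
    errs = []
    for tr, va in kf.split(Xs):
        A = Xs[tr].T @ Xs[tr] + (1.0 / gamma) * np.eye(k_sp)
        beta = np.linalg.solve(A, Xs[tr].T @ y[tr])
        errs.append(np.mean((y[va] - Xs[va] @ beta) ** 2))
    return float(np.mean(errs))

def cyclic_coordinate_descent(X, y, k_grid, gamma_grid, n_sweeps=10):
    # Update one hyperparameter at a time, holding the other fixed,
    # until a full sweep changes nothing.
    k_sp, gamma = k_grid[len(k_grid) // 2], gamma_grid[len(gamma_grid) // 2]
    for _ in range(n_sweeps):
        k_new = min(k_grid, key=lambda k: relaxed_cv_loss(k, gamma, X, y))
        g_new = min(gamma_grid, key=lambda g: relaxed_cv_loss(k_new, g, X, y))
        if (k_new, g_new) == (k_sp, gamma):
            break
        k_sp, gamma = k_new, g_new
    return k_sp, gamma

# Example usage on synthetic data with 5 true features.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 30))
y = X[:, :5] @ np.ones(5) + 0.1 * rng.standard_normal(100)
print(cyclic_coordinate_descent(X, y, k_grid=[2, 5, 10, 20],
                                gamma_grid=[0.1, 1.0, 10.0]))
```

Each sweep costs len(k_grid) + len(gamma_grid) cheap surrogate evaluations instead of a full grid of per-fold MIO solves; solving MIOs only for the shortlisted settings is consistent with the reported 50-80% reduction.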
Ryan Cory-Wright
Imperial Business School
Operations Research · Optimization · Machine Learning · Analytics · Electricity Markets
A. Gómez
Department of Industrial and Systems Engineering, Viterbi School of Engineering, University of Southern California, CA