Strong screening rules for group-based SLOPE models

📅 2024-05-24
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
Hyperparameter tuning for Group SLOPE and Sparse Group SLOPE is computationally expensive due to the need to traverse the full regularization path. Method: This paper introduces, for the first time, theoretically guaranteed strong screening rules for both models—extended to the broader class of generalized group OWL penalties (e.g., OSCAR). The rules leverage KKT condition analysis, duality gap estimation, and group-structured optimization to safely discard irrelevant variables prior to model fitting, drastically reducing input dimensionality. Contribution/Results: Experiments on synthetic and real genomic datasets demonstrate that the proposed rules accelerate training by several-fold on high-dimensional genetic data (e.g., p ≈ 10⁴–10⁵) with zero false negatives. This enables, for the first time, efficient and scalable application of Group SLOPE in ultra-high-dimensional settings—previously infeasible due to prohibitive computational cost.

Technology Category

Application Category

📝 Abstract
Tuning the regularization parameter in penalized regression models is an expensive task, requiring multiple models to be fit along a path of parameters. Strong screening rules drastically reduce computational costs by lowering the dimensionality of the input prior to fitting. We develop strong screening rules for group-based Sorted L-One Penalized Estimation (SLOPE) models: Group SLOPE and Sparse-group SLOPE. The developed rules are applicable to the wider family of group-based OWL models, including OSCAR. Our experiments on both synthetic and real data show that the screening rules significantly accelerate the fitting process. The screening rules make it accessible for group SLOPE and sparse-group SLOPE to be applied to high-dimensional datasets, particularly those encountered in genetics.
Problem

Research questions and friction points this paper is trying to address.

Reducing computational costs in penalized regression models
Developing strong screening rules for group-based SLOPE models
Accelerating model fitting for high-dimensional genetics datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Strong screening rules reduce input dimensionality
Rules apply to group-based OWL family models
Significantly accelerates high-dimensional data fitting
🔎 Similar Papers
No similar papers found.
F
Fabio Feser
Department of Mathematics, Imperial College London
Marina Evangelou
Marina Evangelou
Imperial College London
StatisticsMachine Learning