📝 Abstract
We present a suite of packages in R, Python, Julia, and C++ that efficiently solve the Sorted L-One Penalized Estimation (SLOPE) problem. The packages feature a highly efficient hybrid coordinate descent algorithm that fits generalized linear models (GLMs) with a variety of loss functions, including Gaussian, binomial, Poisson, and multinomial logistic regression. Our implementation is fast, memory-efficient, and flexible: it supports a variety of data structures (dense, sparse, and out-of-memory matrices), efficiently fits the full SLOPE path, and handles cross-validation of SLOPE models, including the relaxed SLOPE. We present usage examples together with benchmarks on both real and simulated data, which show that our packages outperform existing implementations of SLOPE in terms of speed.
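For readers unfamiliar with the problem, the SLOPE penalty is the sorted-ℓ1 norm: a sum of nonincreasing regularization weights applied to the coefficients sorted by decreasing magnitude. The sketch below is an illustration only, not the packages' hybrid coordinate descent solver (which is implemented in C++); it shows the penalty and its proximal operator in plain NumPy, using the standard stack-based pool-adjacent-violators (PAVA) approach. All function names here are hypothetical.

```python
import numpy as np

def sorted_l1_norm(beta, lambdas):
    """SLOPE penalty: sum of nonincreasing weights `lambdas` times |beta| sorted descending."""
    return float(np.sum(lambdas * np.sort(np.abs(beta))[::-1]))

def prox_sorted_l1(v, lambdas):
    """Proximal operator of the sorted-L1 norm.

    Sort |v| in decreasing order, subtract the weights, project the result onto
    the nonincreasing nonnegative cone with PAVA, then undo the sort and signs.
    """
    sign = np.sign(v)
    order = np.argsort(np.abs(v))[::-1]        # indices sorting |v| descending
    z = np.abs(v)[order] - lambdas             # shifted, sorted magnitudes
    n = len(z)
    # PAVA: maintain a stack of blocks; merge (average) adjacent blocks that
    # violate the nonincreasing constraint.
    vals = np.zeros(n)
    counts = np.zeros(n, dtype=int)
    k = 0
    for i in range(n):
        vals[k] = z[i]
        counts[k] = 1
        k += 1
        while k > 1 and vals[k - 1] > vals[k - 2]:
            total = vals[k - 1] * counts[k - 1] + vals[k - 2] * counts[k - 2]
            counts[k - 2] += counts[k - 1]
            vals[k - 2] = total / counts[k - 2]
            k -= 1
    # Expand blocks, clip at zero, and restore the original order and signs.
    out = np.zeros(n)
    i = 0
    for j in range(k):
        out[i:i + counts[j]] = max(vals[j], 0.0)
        i += counts[j]
    res = np.zeros(n)
    res[order] = out
    return sign * res
```

When all weights are equal this reduces to ordinary soft thresholding, and distinct weights can cluster coefficients to a common magnitude, which is the hallmark of SLOPE.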