Improved $\ell_p$ Regression via Iteratively Reweighted Least Squares

📅 2025-10-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the ℓₚ regression problem, aiming to bridge the performance gap between theoretical algorithms and practical solvers. The authors propose a primal-dual optimization framework based on iteratively reweighted least squares (IRLS), which exploits an invariant of the dual objective to derive simple, closed-form update rules, bypassing expensive subproblem solves. The method retains state-of-the-art iteration complexity while requiring only lightweight matrix operations per iteration, drastically reducing per-step computational overhead. Theoretical analysis establishes global convergence and complexity guarantees. Empirical evaluation demonstrates superior accuracy, speed, and robustness compared to the Adil–Peng–Sachdeva IRLS algorithm and mainstream solvers including MATLAB/CVX. The key innovation lies in the tight integration of primal-dual structure with IRLS, achieving both theoretical rigor and engineering practicality.

📝 Abstract
We introduce fast algorithms for solving $\ell_p$ regression problems using the iteratively reweighted least squares (IRLS) method. Our approach achieves state-of-the-art iteration complexity, outperforming the IRLS algorithm by Adil-Peng-Sachdeva (NeurIPS 2019) and matching the theoretical bounds established by the complex algorithm of Adil-Kyng-Peng-Sachdeva (SODA 2019, J. ACM 2024) via a simpler lightweight iterative scheme. This bridges the existing gap between theoretical and practical algorithms for $\ell_p$ regression. Our algorithms depart from prior approaches, using a primal-dual framework, in which the update rule can be naturally derived from an invariant maintained for the dual objective. Empirically, we show that our algorithms significantly outperform both the IRLS algorithm by Adil-Peng-Sachdeva and MATLAB/CVX implementations.
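For context on the family of methods the paper improves, here is a minimal sketch of *classical* IRLS for $\ell_p$ regression, $\min_x \|Ax - b\|_p$. This is the textbook scheme (reweight by $|r_i|^{p-2}$, then solve a weighted least-squares problem), not the paper's primal-dual variant; the function name, clipping constant, and iteration count are illustrative choices.

```python
import numpy as np

def irls_lp(A, b, p=1.5, iters=50, eps=1e-8):
    """Textbook IRLS sketch for min_x ||Ax - b||_p with 1 < p < infinity.

    Each iteration solves the weighted least-squares problem
        min_x sum_i w_i (a_i^T x - b_i)^2,  w_i = |r_i|^(p-2),
    where r = Ax - b. Residuals are clipped away from zero so the
    weights stay finite when p < 2.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # start from the l2 solution
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)  # reweighting step
        Aw = A * w[:, None]                        # rows scaled by w_i
        # Normal equations of the weighted problem: A^T W A x = A^T W b
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x
```

Each step costs one weighted least-squares solve, which is why the iteration count is the key complexity measure; the paper's contribution is achieving the best known iteration bound with updates of comparable per-step cost.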
Problem

Research questions and friction points this paper is trying to address.

Developing fast algorithms for ℓp regression problems
Achieving state-of-the-art iteration complexity via IRLS
Bridging theoretical-practical gap with primal-dual framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using iteratively reweighted least squares method
Employing primal-dual framework for update rule
Achieving state-of-the-art iteration complexity