FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse Optimisation on Measures

📅 2023-12-10
📈 Citations: 1
Influential: 1
🤖 AI Summary
To address the scalability limitations of Conic Particle Gradient Descent (CPGD) for sparse optimisation over measure spaces, this paper proposes FastPart, a novel algorithm that integrates stochastic gradient descent with random feature maps to substantially reduce CPGD's computational and memory cost. Theoretically, the authors establish, for the first time within a variational framework, a uniform bound on the total variation norm of the measures along the descent trajectory; derive a global convergence rate of $\mathcal{O}(\log(K)/\sqrt{K})$ over $K$ iterations; and provide a local controllability analysis of the deviation from the first-order optimality conditions. Empirically, FastPart achieves significant gains in computational efficiency and robustness for large-scale sparse optimisation on measures, while preserving solution stability and the theoretical guarantees.
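The summary does not restate the underlying problem; for orientation, CPGD-type methods are typically applied to sparse inverse problems over nonnegative measures of roughly the following form, with the measure over-parameterized as a sum of weighted particles. The objective, the operator $\Phi$, the data $y$, the weight $\lambda$, and the conic parameterization below are illustrative assumptions, not taken from this paper:

$$
\min_{\mu \in \mathcal{M}_+(\mathcal{X})}\; \tfrac{1}{2}\,\lVert \Phi\mu - y \rVert^2 + \lambda\,\mu(\mathcal{X}),
\qquad
\mu = \sum_{i=1}^{n} r_i^2\,\delta_{x_i}.
$$

In this picture, CPGD updates each particle's mass $r_i^2$ multiplicatively and its position $x_i$ by a gradient step; FastPart, per the summary, replaces the exact gradients with mini-batch and random-feature estimates.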
📝 Abstract
This paper presents a novel algorithm that leverages Stochastic Gradient Descent strategies in conjunction with Random Features to augment the scalability of Conic Particle Gradient Descent (CPGD) specifically tailored for solving sparse optimisation problems on measures. By formulating the CPGD steps within a variational framework, we provide rigorous mathematical proofs demonstrating the following key findings: (i) The total variation norms of the solution measures along the descent trajectory remain bounded, ensuring stability and preventing undesirable divergence; (ii) We establish a global convergence guarantee with a convergence rate of $\mathcal{O}(\log(K)/\sqrt{K})$ over $K$ iterations, showcasing the efficiency and effectiveness of our algorithm; (iii) Additionally, we analyze and establish local control over the first-order condition discrepancy, contributing to a deeper understanding of the algorithm's behavior and reliability in practical applications.
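As a rough illustration of the kind of update the abstract describes, here is a minimal NumPy sketch of one over-parameterized conic-particle step driven by a stochastic (mini-batch) gradient. All names, the squared-loss objective, the step sizes, and the finite-difference position update are assumptions for illustration, not the paper's actual FastPart algorithm.

```python
import numpy as np

def cpgd_step(weights, positions, features, y, batch, lam=0.1, eta_r=0.1, eta_x=0.1):
    """One stochastic conic-particle update (illustrative sketch, not the paper's exact method).

    weights   : (n,) nonnegative particle masses
    positions : (n, d) particle locations
    features  : callable(positions, batch) -> (len(batch), n) feature matrix on the mini-batch
    y         : (m,) observations; `batch` is an index array selecting a mini-batch
    """
    Phi = features(positions, batch)                  # (b, n): column i = features of particle i
    resid = Phi @ weights - y[batch]                  # mini-batch residual
    grad_w = Phi.T @ resid / len(batch) + lam         # stochastic gradient w.r.t. each mass
    weights = weights * np.exp(-eta_r * grad_w)       # multiplicative (conic/mirror) mass update

    # Position update via a finite-difference surrogate for the gradient in x (illustration only).
    eps = 1e-4
    grad_x = np.zeros_like(positions)
    for j in range(positions.shape[1]):
        shifted = positions.copy()
        shifted[:, j] += eps
        dPhi = (features(shifted, batch) - Phi) / eps  # approximate sensitivity of the features
        grad_x[:, j] = weights * (dPhi.T @ resid) / len(batch)
    positions = positions - eta_x * grad_x
    return weights, positions
```

A caller would supply `features` as, for instance, a random feature map evaluated only on the sampled observations, which is where the claimed reduction in per-iteration cost comes from.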
Problem

Research questions and friction points this paper is trying to address.

Solves sparse optimisation problems on measures
Improves the scalability of Conic Particle Gradient Descent (CPGD)
Provides convergence guarantees for the resulting stochastic gradient method
Innovation

Methods, ideas, or system contributions that make the work stand out.

Stochastic Gradient Descent combined with Random Features (a minimal sketch follows this list)
Variational framework for CPGD steps
Global convergence guarantee with O(log(K)/√K) rate
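For the random-features component above, one standard construction is random Fourier features for a Gaussian kernel. The following minimal NumPy sketch (the Gaussian-kernel choice and all names are assumptions, not taken from the paper) shows how kernel evaluations are replaced by a fixed, finite-dimensional random feature map:

```python
import numpy as np

def random_fourier_features(X, n_features=256, bandwidth=1.0, seed=0):
    """Map points X of shape (n, d) to features of shape (n, n_features) such that
    z(x) @ z(x') approximates exp(-||x - x'||^2 / (2 * bandwidth**2)).
    Illustrative sketch of the random-feature idea; not the paper's construction."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / bandwidth, size=(X.shape[1], n_features))  # random frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)                    # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```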