Stability of Primal-Dual Gradient Flow Dynamics for Multi-Block Convex Optimization Problems

📅 2024-08-28
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
This work investigates the stability of primal-dual gradient flow dynamics for multi-block composite convex optimization under generalized consensus constraints, with a focus on large-scale distributed settings involving multiple nonsmooth terms. The authors propose a continuous-time dynamical framework based on the proximal augmented Lagrangian and establish global exponential convergence via Lyapunov analysis. Compared with mainstream discrete-time algorithms such as ADMM and EXTRA, the approach significantly relaxes standard assumptions, namely strong convexity and smoothness of the objective components and algebraic connectivity of the communication graph, and further shows that some of the relaxed structural assumptions are necessary for exponential stability. The theoretical results provide milder, more broadly applicable convergence guarantees for distributed nonsmooth optimization. Numerical experiments demonstrate the efficiency and practicality of the proposed dynamics in both parallel and distributed implementations.

📝 Abstract
We examine stability properties of primal-dual gradient flow dynamics for composite convex optimization problems with multiple, possibly nonsmooth, terms in the objective function under the generalized consensus constraint. The proposed dynamics are based on the proximal augmented Lagrangian and they provide a viable alternative to ADMM which faces significant challenges from both analysis and implementation viewpoints in large-scale multi-block scenarios. In contrast to customized algorithms with individualized convergence guarantees, we provide a systematic approach for solving a broad class of challenging composite optimization problems. We leverage various structural properties to establish global (exponential) convergence guarantees for the proposed dynamics. Our assumptions are much weaker than those required to prove (exponential) stability of various primal-dual dynamics as well as (linear) convergence of discrete-time methods, e.g., standard two-block and multi-block ADMM and EXTRA algorithms. Finally, we show necessity of some of our structural assumptions for exponential stability and provide computational experiments to demonstrate the convenience of the proposed dynamics for parallel and distributed computing applications.
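To make the construction in the abstract concrete, below is a minimal sketch (not the authors' implementation) of a forward-Euler discretization of primal-dual gradient flow on the proximal augmented Lagrangian, applied to a toy two-block problem: minimize 0.5‖Ax − b‖² + λ‖z‖₁ subject to x = z. The nonsmooth ℓ₁ term enters only through its proximal operator (soft-thresholding), via the gradient of its Moreau envelope; the step size, horizon, and parameter μ are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def primal_dual_flow(A, b, lam, mu=0.5, step=0.01, iters=20000):
    """Forward-Euler discretization of primal-dual gradient flow on the
    proximal augmented Lagrangian for
        minimize 0.5*||A x - b||^2 + lam*||z||_1  subject to  x = z.
    The nonsmooth term is handled through its Moreau envelope, whose
    gradient is expressed with the proximal operator.
    """
    n = A.shape[1]
    x = np.zeros(n)   # primal variable
    y = np.zeros(n)   # dual variable (multiplier for x = z)
    for _ in range(iters):
        v = x + mu * y
        z = soft_threshold(v, mu * lam)          # prox of (mu*lam)*||.||_1
        grad_env = (v - z) / mu                  # gradient of Moreau envelope
        x_dot = -(A.T @ (A @ x - b) + grad_env)  # -grad_x L_mu
        y_dot = x - z                            # +grad_y L_mu (simplified form)
        x += step * x_dot
        y += step * y_dot
    return x

# 1-D sanity check: min 0.5*(x-3)^2 + |x| has closed-form solution x* = 2
x_star = primal_dual_flow(np.array([[1.0]]), np.array([3.0]), lam=1.0)
```

In the 1-D sanity check the flow settles at the known minimizer x* = 2, illustrating the convergence of the continuous-time dynamics under discretization; the paper's contribution is establishing global (exponential) stability of such dynamics under much weaker assumptions, including the multi-block and generalized-consensus cases.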
Problem

Research questions and friction points this paper is trying to address.

Analyze stability of primal-dual gradient flow for multi-block convex optimization
Propose alternative to ADMM for large-scale nonsmooth composite optimization
Establish global convergence guarantees under weaker structural assumptions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proximal augmented Lagrangian dynamics
Systematic approach for composite optimization
Global exponential convergence guarantees
Ibrahim Kurban Özaslan
Ming Hsieh Department of Electrical Engineering, University of Southern California, Los Angeles, CA 90089
Panagiotis Patrinos
Associate Professor, KU Leuven
optimization · machine learning · systems & control
Mihailo R. Jovanovic
Ming Hsieh Department of Electrical Engineering, University of Southern California, Los Angeles, CA 90089