Exploiting Similarity for Computation and Communication-Efficient Decentralized Optimization

📅 2025-06-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of jointly minimizing communication and computational overhead in decentralized optimization, this paper proposes Stabilized Proximal Decentralized Optimization (SPDO). Unlike existing proximal methods, which require high-precision subproblem solutions to exploit functional similarity, SPDO relaxes the subproblem accuracy requirements within the Proximal Decentralized Optimization (PDO) framework, while leveraging average functional similarity and a stability-preservation mechanism. Theoretically, SPDO achieves state-of-the-art communication and computational complexity. Empirically, across diverse network topologies and non-i.i.d. data distributions, SPDO reduces the number of communication rounds by 30%–50% and cuts total computation time by over 40%, demonstrating significant efficiency gains without sacrificing convergence stability or accuracy.

📝 Abstract
Reducing communication complexity is critical for efficient decentralized optimization. The proximal decentralized optimization (PDO) framework is particularly appealing, as methods within this framework can exploit functional similarity among nodes to reduce communication rounds. Specifically, when local functions at different nodes are similar, these methods achieve faster convergence with fewer communication steps. However, existing PDO methods often require highly accurate solutions to subproblems associated with the proximal operator, resulting in significant computational overhead. In this work, we propose the Stabilized Proximal Decentralized Optimization (SPDO) method, which achieves state-of-the-art communication and computational complexities within the PDO framework. Additionally, we refine the analysis of existing PDO methods by relaxing subproblem accuracy requirements and leveraging average functional similarity. Experimental results demonstrate that SPDO significantly outperforms existing methods.
Problem

Research questions and friction points this paper is trying to address.

Reducing communication complexity in decentralized optimization
Exploiting functional similarity to decrease communication rounds
Minimizing computational overhead in proximal operator subproblems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proposes Stabilized Proximal Decentralized Optimization method
Reduces communication rounds via functional similarity
Relaxes subproblem accuracy to cut computational overhead
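As a rough illustration of the ideas above (not the paper's actual SPDO algorithm), the following sketch runs an inexact proximal decentralized iteration on a ring of nodes: each node solves its proximal subproblem only approximately, with a few gradient steps, then averages with its neighbors. The quadratic local losses, step sizes, and topology are all hypothetical choices for this toy example.

```python
import numpy as np

# Hypothetical sketch (not the paper's SPDO method): an inexact proximal
# decentralized iteration on a ring of n nodes, each holding a local
# quadratic loss f_i(x) = 0.5 * (x - b_i)^2.  Each round, every node solves
#     argmin_z  f_i(z) + (1 / (2 * eta)) * (z - y_i)^2
# only approximately, with a few gradient steps -- mirroring the relaxed
# subproblem-accuracy idea; communication is plain gossip averaging.

n, eta, inner_steps, rounds = 8, 0.5, 3, 200
rng = np.random.default_rng(0)
b = rng.normal(size=n)        # local minimizers b_i (one scalar per node)
x = np.zeros(n)               # current iterate at each node

# Doubly stochastic mixing matrix for a ring topology.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

for _ in range(rounds):
    y = W @ x                 # one communication round: gossip averaging
    for i in range(n):
        z = y[i]
        for _ in range(inner_steps):          # inexact proximal solve
            grad = (z - b[i]) + (z - y[i]) / eta
            z -= 0.2 * grad
        x[i] = z

# The node average x.mean() approaches b.mean(); exact consensus would
# additionally require a tracking/correction mechanism, which this toy omits.
```

Cheaper inner solves trade subproblem accuracy for computation per round, which is the trade-off the paper's relaxed-accuracy analysis addresses.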
Yuki Takezawa
Kyoto University / OIST
Machine Learning, Optimization, Optimal Transport
Xiaowen Jiang
CISPA Helmholtz Center for Information Security, Saarland University
Anton Rodomanov
CISPA Helmholtz Center for Information Security
Optimization, Machine Learning, Numerical Methods, Complexity Guarantees
Sebastian U. Stich
CISPA Helmholtz Center for Information Security