Distributed Stochastic Momentum Tracking with Local Updates: Achieving Optimal Communication and Iteration Complexities

📅 2025-10-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the fundamental trade-off between communication overhead and convergence efficiency in distributed stochastic optimization, this paper proposes Local Momentum Tracking (LMT). LMT integrates multi-step local updates, a momentum tracking strategy, and Loopless Chebyshev Acceleration (LCA), allowing each agent to perform several local computations per communication round while preserving global coordination. Theoretically, LMT achieves linear speedup with respect to both the number of local updates and the number of agents; with sufficiently many local updates it attains the optimal communication complexity, and with a moderate number of local updates it attains the optimal iteration complexity, making it the first method to enjoy both properties, without requiring strong convexity or second-order smoothness assumptions. Empirical evaluations demonstrate that LMT significantly outperforms state-of-the-art methods in bandwidth-constrained networks, easing the communication–computation trade-off.
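The Chebyshev-style acceleration mentioned above builds on a well-known idea from accelerated consensus: replace plain gossip `x ← W x` with a two-term recursion that also uses the previous iterate. The sketch below is a generic illustration of that accelerated-gossip recursion on a ring network, not the paper's exact LCA scheme; the mixing matrix, weight formula, and iteration counts are illustrative assumptions.

```python
import numpy as np

N = 8
# Ring mixing matrix: each agent averages with its two neighbors (doubly stochastic).
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 0.5
    W[i, (i - 1) % N] = 0.25
    W[i, (i + 1) % N] = 0.25

# Second-largest eigenvalue of W governs the plain-gossip rate.
lam = np.sort(np.linalg.eigvalsh(W))[-2]
# Heavy-ball-style weight used in Chebyshev-accelerated consensus (assumed form).
eta = (1 - np.sqrt(1 - lam**2)) / (1 + np.sqrt(1 - lam**2))

rng = np.random.default_rng(1)
x = rng.normal(size=N)          # each agent holds one scalar value
target = x.mean()               # consensus value both schemes should reach

x_prev = x.copy()
x_plain = x.copy()
for _ in range(30):
    # Plain gossip: one multiplication by W per round.
    x_plain = W @ x_plain
    # Accelerated gossip: mixing plus a momentum term on the previous iterate.
    x, x_prev = (1 + eta) * (W @ x) - eta * x_prev, x

err_acc = np.abs(x - target).max()
err_plain = np.abs(x_plain - target).max()
```

Both recursions preserve the network average, but the accelerated variant contracts the disagreement at roughly the square-root of the plain-gossip rate, which is why Chebyshev-type schemes help on poorly connected (bandwidth-constrained) networks.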

📝 Abstract
We propose Local Momentum Tracking (LMT), a novel distributed stochastic gradient method for solving distributed optimization problems over networks. To reduce communication overhead, LMT enables each agent to perform multiple local updates between consecutive communication rounds. Specifically, LMT integrates local updates with the momentum tracking strategy and the Loopless Chebyshev Acceleration (LCA) technique. We demonstrate that LMT achieves linear speedup with respect to the number of local updates as well as the number of agents for minimizing smooth objective functions. Moreover, with sufficiently many local updates ($Q \geq Q^*$), LMT attains the optimal communication complexity. For a moderate number of local updates ($Q \in [1, Q^*]$), it achieves the optimal iteration complexity. To our knowledge, LMT is the first method that enjoys such properties.
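The multi-step local-update scheme in the abstract can be illustrated with a toy sketch: each agent runs `Q` momentum-SGD steps on its own quadratic objective between communication rounds, then the network averages both iterates and momentum buffers. This is a simplified illustration under stated assumptions (a fully connected averaging matrix, plain periodic averaging instead of the paper's momentum tracking correction, and no Chebyshev acceleration), not LMT itself; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d, Q, T = 4, 3, 5, 200          # agents, dimension, local steps, communication rounds
lr, beta = 0.05, 0.9               # step size and momentum parameter (illustrative)

targets = rng.normal(size=(N, d))  # agent i holds f_i(x) = 0.5 * ||x - targets[i]||^2
opt = targets.mean(axis=0)         # minimizer of the global average objective

def grads(x):
    # Row i is the gradient of f_i evaluated at agent i's own iterate.
    return x - targets

W = np.full((N, N), 1.0 / N)       # fully connected averaging matrix (doubly stochastic)

x = np.zeros((N, d))               # row i is agent i's iterate
v = grads(x)                       # row i is agent i's momentum buffer

for _ in range(T):
    for _ in range(Q):             # Q cheap local updates, no communication
        x = x - lr * v
        v = beta * v + (1 - beta) * grads(x)
    x = W @ x                      # one communication round: average the iterates...
    v = W @ v                      # ...and the momentum buffers
```

With linear (quadratic-objective) gradients and a doubly stochastic `W`, the averaged iterate follows an exact heavy-ball recursion on the global objective, so it converges to `opt`; the point of the `Q` inner steps is that only one of every `Q + 1` updates costs a communication round.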
Problem

Research questions and friction points this paper is trying to address.

Reduces communication overhead in distributed optimization networks
Achieves optimal communication complexity with sufficient local updates
Attains optimal iteration complexity with moderate local updates
Innovation

Methods, ideas, or system contributions that make the work stand out.

Local Momentum Tracking reduces communication overhead
Integrates momentum tracking with Loopless Chebyshev Acceleration
Achieves optimal communication and iteration complexities
Kun Huang
The Chinese University of Hong Kong, Shenzhen School of Data Science (SDS) Shenzhen, Guangdong, China
Shi Pu
China Telecom Guizhou Branch