🤖 AI Summary
This work addresses decentralized nonconvex optimization over row-stochastic networks in the presence of heavy-tailed gradient noise. The authors propose a novel algorithm that integrates normalized stochastic gradient descent with Pull-Diag gradient tracking. To the best of the authors' knowledge, this is the first method to simultaneously achieve optimal sample complexity and near-optimal communication complexity under heavy-tailed noise, and it applies to both directed and undirected network topologies. The theoretical analysis establishes nearly tight upper bounds on these complexities, and empirical evaluations demonstrate the algorithm's superior performance in practical scenarios.
📝 Abstract
This paper studies the decentralized stochastic nonconvex optimization problem over row-stochastic networks. We consider heavy-tailed gradient noise, which is empirically observed in many popular real-world applications. Specifically, we propose a decentralized normalized stochastic gradient descent method with Pull-Diag gradient tracking, which reaches approximate stationary points with optimal sample complexity and near-optimal communication complexity. We further extend our framework to the setting of undirected networks, again achieving nearly tight upper complexity bounds. Moreover, we conduct empirical studies to demonstrate the practical superiority of the proposed methods.
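To make the main ingredients concrete, below is a minimal NumPy simulation sketch of normalized stochastic gradient descent combined with a Pull-Diag-style gradient-tracking correction over a row-stochastic mixing matrix. The topology, local objectives, Student-t noise model, and the exact form of the diagonal correction are illustrative assumptions for this sketch, not the paper's precise algorithm or constants.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 10          # number of nodes, problem dimension
T, gamma = 500, 0.05  # iterations, step size (illustrative values)

# Row-stochastic mixing matrix for a directed ring with self-loops
# (a hypothetical topology; any strongly connected digraph works).
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 0.5
    A[i, (i - 1) % n] = 0.5

# Quadratic local losses f_i(x) = 0.5 * ||x - b_i||^2 as stand-ins
# (the paper targets nonconvex losses), perturbed by heavy-tailed noise.
b = rng.normal(size=(n, d))

def stoch_grad(x):
    # Student-t noise (df=3) is heavy-tailed; a stand-in for the
    # heavy-tailed gradient noise model considered in the paper.
    return (x - b) + rng.standard_t(df=3, size=(n, d))

x = np.zeros((n, d))
P = np.eye(n)                # running product of A; node i holds row i
g = stoch_grad(x)
y = g / np.diag(P)[:, None]  # tracker with Pull-Diag-style diagonal scaling

for k in range(T):
    # Normalized descent direction: each node normalizes its own tracker.
    v = y / np.linalg.norm(y, axis=1, keepdims=True)
    x = A @ (x - gamma * v)  # row-stochastic ("pull") mixing step
    P_new = A @ P            # update the diagonal-correction weights
    g_new = stoch_grad(x)
    # Gradient tracking with diagonal correction: the i-th diagonal entry
    # of A^k rescales node i's stochastic gradient before tracking.
    y = A @ y + g_new / np.diag(P_new)[:, None] - g / np.diag(P)[:, None]
    P, g = P_new, g_new

print("mean distance to the average-loss minimizer:",
      np.linalg.norm(x - b.mean(0), axis=1).mean())
```

In this sketch, normalizing the tracker bounds each node's step regardless of the noise magnitude (the usual remedy for heavy tails), while the diagonal entries of the running product A^k compensate for the fact that a row-stochastic matrix alone does not preserve the gradient average.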