🤖 AI Summary
This work addresses non-smooth non-convex optimization over decentralized networks, where each node's objective function is Lipschitz continuous. The authors propose an efficient stochastic first-order method that incorporates client sampling and leverages spectral gap analysis of the mixing matrix to significantly reduce communication overhead while maintaining optimal sample complexity. The approach is further extended to the zeroth-order setting. Theoretical analysis shows that the algorithm reaches a $(\delta,\varepsilon)$-Goldstein stationary point with $\mathcal{O}(\delta^{-1}\varepsilon^{-3})$ sample and computational complexity, and achieves a communication complexity of $\tilde{\mathcal{O}}(\gamma^{-1/2}\delta^{-1}\varepsilon^{-3})$, outperforming existing methods. Experimental results confirm the algorithm's effectiveness and superiority in practice.
📝 Abstract
This paper considers the decentralized nonsmooth nonconvex optimization problem with Lipschitz continuous local functions. We propose an efficient stochastic first-order method with client sampling, achieving a $(\delta,\epsilon)$-Goldstein stationary point with an overall sample complexity of ${\mathcal O}(\delta^{-1}\epsilon^{-3})$, computation rounds of ${\mathcal O}(\delta^{-1}\epsilon^{-3})$, and communication rounds of ${\tilde{\mathcal O}}(\gamma^{-1/2}\delta^{-1}\epsilon^{-3})$, where $\gamma$ is the spectral gap of the mixing matrix for the network. Our results achieve the optimal sample complexity and a sharper communication complexity than existing methods. We also extend our ideas to zeroth-order optimization. Moreover, numerical experiments show the empirical advantage of our methods.
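To make the quantities in the abstract concrete, the following is a minimal sketch (not the paper's algorithm) of decentralized stochastic optimization of nonsmooth Lipschitz local losses: each node takes a randomized-smoothing gradient-estimate step, then one gossip round with a doubly stochastic mixing matrix $W$, whose spectral gap $\gamma = 1 - \lambda_2(W)$ is the quantity appearing in the communication complexity. The ring topology, losses $f_i(x)=\|x-b_i\|_1$, step size, and smoothing radius are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 5  # number of nodes, problem dimension

# Ring-topology mixing matrix: doubly stochastic, each node averages
# with its two neighbours (an assumed example network).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

# Spectral gap gamma = 1 - (second-largest eigenvalue magnitude of W);
# a smaller gap (sparser network) means more communication is needed.
eigs = np.sort(np.abs(np.linalg.eigvals(W)))[::-1]
gamma = 1.0 - eigs[1]

# Nonsmooth Lipschitz local losses f_i(x) = ||x - b_i||_1.
b = rng.normal(size=(n, d))

def stoch_grad(x, bi, delta=0.05):
    """Two-point randomized-smoothing gradient estimate of f_i:
    sample a uniform unit direction u and difference f_i at x +/- delta*u.
    Such estimators underpin Goldstein-stationarity guarantees for
    Lipschitz functions (illustrative, not the paper's estimator)."""
    u = rng.normal(size=d)
    u /= np.linalg.norm(u)
    fp = np.sum(np.abs(x + delta * u - bi))
    fm = np.sum(np.abs(x - delta * u - bi))
    return d * (fp - fm) / (2 * delta) * u

X = np.zeros((n, d))  # one local iterate per node (rows)
eta = 0.05            # assumed constant step size
for t in range(2000):
    G = np.array([stoch_grad(X[i], b[i]) for i in range(n)])
    X = W @ (X - eta * G)  # local gradient step, then one gossip round

x_bar = X.mean(axis=0)                     # network-average iterate
consensus_err = np.linalg.norm(X - x_bar)  # disagreement across nodes
```

The global minimizer of $\sum_i \|x-b_i\|_1$ is the coordinate-wise median of the $b_i$, so `x_bar` should drift toward it while the gossip steps keep `consensus_err` bounded; methods like the paper's improve on this template via client sampling and accelerated (multi-round) gossip.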