🤖 AI Summary
Existing federated learning methods for nonsmooth convex optimization lack theoretical guarantees for multi-step local updates in the absence of data heterogeneity assumptions.
Method: We propose FedMLS, the first algorithm to establish a tight $O(1/\varepsilon)$ communication-round complexity bound for nonsmooth convex federated optimization under *no data heterogeneity assumption whatsoever*. FedMLS integrates stochastic subgradient-based local updates with server-side projection-based calibration, ensuring both computational efficiency and theoretical rigor.
Results: To achieve an $\varepsilon$-suboptimal solution, FedMLS requires only $O(1/\varepsilon)$ communication rounds and $O(1/\varepsilon^2)$ stochastic subgradient queries—significantly improving upon prior assumption-free methods and substantially reducing communication overhead.
📝 Abstract
Multiple local steps are key to communication-efficient federated learning. However, theoretical guarantees for such algorithms, without assumptions bounding data heterogeneity, have been lacking for general nonsmooth convex problems. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
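To make the setting concrete, here is a minimal toy sketch of the multi-local-step pattern the abstract describes: each client runs several stochastic subgradient steps on a nonsmooth convex loss before communicating. Everything here is illustrative, not the paper's method: the problem (least absolute deviations), the step-size schedule, and all names are made up, and plain model averaging stands in for FedMLS's server-side projection-based calibration, whose details are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonsmooth convex objective: least absolute deviations (L1 regression)
# split across clients, f(w) = mean_i |x_i^T w - y_i|.
num_clients, n_per, dim = 4, 50, 3
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(num_clients):
    X = rng.normal(size=(n_per, dim))
    clients.append((X, X @ w_true))

def stoch_subgrad(w, X, y):
    """Stochastic subgradient of mean_i |x_i^T w - y_i| from one sampled point."""
    i = rng.integers(len(y))
    r = X[i] @ w - y[i]
    return np.sign(r) * X[i]

def fed_multi_local_steps(rounds=300, local_steps=20, lr=0.1):
    w = np.zeros(dim)
    for t in range(rounds):
        local_models = []
        for X, y in clients:
            v = w.copy()
            # Multiple local stochastic subgradient steps per round,
            # with a decaying step size (illustrative schedule).
            for _ in range(local_steps):
                v -= lr / np.sqrt(1 + t) * stoch_subgrad(v, X, y)
            local_models.append(v)
        # Server step: plain averaging stands in for FedMLS's
        # projection-based calibration (not specified in the abstract).
        w = np.mean(local_models, axis=0)
    return w

w_hat = fed_multi_local_steps()
obj = np.mean([np.mean(np.abs(X @ w_hat - y)) for X, y in clients])
print(f"final objective: {obj:.3f}")
```

Each round costs one communication but `local_steps` oracle calls per client, which is the trade-off behind the $\mathcal{O}(1/\epsilon)$ rounds versus $\mathcal{O}(1/\epsilon^2)$ subgradient queries in the stated complexity.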