Provable Reduction in Communication Rounds for Non-Smooth Convex Federated Learning

📅 2025-03-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing federated learning methods for nonsmooth convex optimization lack theoretical guarantees for multi-step local updates in the absence of data heterogeneity assumptions. Method: We propose FedMLS, the first algorithm to establish a tight $O(1/\varepsilon)$ communication-round complexity bound for nonsmooth convex federated optimization under *no data heterogeneity assumption whatsoever*. FedMLS integrates stochastic subgradient-based local updates with server-side projection-based calibration, ensuring both computational efficiency and theoretical rigor. Results: To achieve an $\varepsilon$-suboptimal solution, FedMLS requires only $O(1/\varepsilon)$ communication rounds and $O(1/\varepsilon^2)$ stochastic subgradient queries—significantly improving upon prior assumption-free methods and substantially reducing communication overhead.
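The summary describes the general pattern FedMLS builds on: each client runs multiple stochastic subgradient steps locally, and the server aggregates between communication rounds. The sketch below illustrates that pattern on a toy non-smooth convex problem (least-absolute-deviation regression). It is a minimal illustration, not the paper's algorithm: the server-side projection-based calibration is not specified in the summary, so a plain average is used for aggregation, and the problem, step size, and round counts are made-up choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-smooth convex objective: f(w) = (1/n) * sum_i |x_i @ w - y_i|,
# split across clients. All dimensions and constants are illustrative.
n_clients, n_samples, dim = 4, 50, 5
w_true = rng.normal(size=dim)
clients = []
for _ in range(n_clients):
    X = rng.normal(size=(n_samples, dim))
    y = X @ w_true + 0.1 * rng.normal(size=n_samples)
    clients.append((X, y))

def stochastic_subgrad(w, X, y):
    """Subgradient of |x_i @ w - y_i| at one uniformly sampled point."""
    i = rng.integers(len(y))
    return np.sign(X[i] @ w - y[i]) * X[i]

w = np.zeros(dim)
rounds, local_steps, lr = 200, 10, 0.05
for _ in range(rounds):                  # one communication round per iteration
    local_models = []
    for X, y in clients:
        w_loc = w.copy()
        for _ in range(local_steps):     # multiple local subgradient steps
            w_loc -= lr * stochastic_subgrad(w_loc, X, y)
        local_models.append(w_loc)
    w = np.mean(local_models, axis=0)    # server aggregation (plain average here)

err = np.mean([np.mean(np.abs(X @ w - y)) for X, y in clients])
print(err)
```

With multiple local steps, the total subgradient queries grow as rounds × local_steps × clients, which mirrors the trade-off in the stated rates: $O(1/\varepsilon)$ rounds but $O(1/\varepsilon^2)$ total oracle calls.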

📝 Abstract
Multiple local steps are key to communication-efficient federated learning. However, theoretical guarantees for such algorithms, without data heterogeneity-bounding assumptions, have been lacking in general non-smooth convex problems. Leveraging projection-efficient optimization methods, we propose FedMLS, a federated learning algorithm with provable improvements from multiple local steps. FedMLS attains an $\epsilon$-suboptimal solution in $\mathcal{O}(1/\epsilon)$ communication rounds, requiring a total of $\mathcal{O}(1/\epsilon^2)$ stochastic subgradient oracle calls.
Problem

Research questions and friction points this paper is trying to address.

Reducing communication rounds in federated learning
Addressing non-smooth convex problems without data assumptions
Provable efficiency gains from multiple local steps
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses projection-efficient optimization methods
Proposes FedMLS algorithm for federated learning
Achieves O(1/ε) communication rounds
Karlo Palenzuela
Department of Computing Science, Umeå University, Umeå, Sweden
Ali Dadras
PhD, Umeå University
Machine Learning · Optimization · Applied Mathematics
Alp Yurtsever
Umeå University
Machine Learning · Optimization · Applied Mathematics
Tommy Löfstedt
Department of Computing Science, Umeå University, Umeå, Sweden