Federated learning over physical channels: adaptive algorithms with near-optimal guarantees

📅 2025-09-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the high communication overhead and physical-layer transmission impairments (such as channel noise and hardware constraints) that arise in federated learning, this paper proposes an adaptive federated stochastic gradient descent (FedSGD) algorithm tailored to over-the-air computation (AirComp). The method is an adaptive, noise-robust gradient aggregation scheme that operates directly over noisy physical channels, dynamically adjusting the learning rate to compensate for channel-induced gradient distortion. The authors establish convergence guarantees that adapt to the stochastic gradient noise level, characterizing how convergence depends on channel noise power and device heterogeneity. Simulations with deep learning models demonstrate the algorithm's efficiency and robustness under non-ideal wireless channels and resource-constrained edge devices.
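As a point of reference, the sketch below simulates this kind of adaptive FedSGD update over a noisy analog channel. Everything in it is an illustrative assumption rather than the paper's algorithm: the additive-Gaussian channel, the AdaGrad-norm-style step-size rule, and the names `local_gradient` and `aircomp_fedsgd` are all hypothetical.

```python
import numpy as np

def local_gradient(w, X, y):
    # Least-squares stochastic gradient on one device's local data.
    return X.T @ (X @ w - y) / len(y)

def aircomp_fedsgd(devices, w, rng, rounds=300, noise_std=0.1, eta0=1.0):
    """Adaptive FedSGD with over-the-air gradient aggregation.

    Devices transmit analog gradients simultaneously; the channel
    superimposes them and adds Gaussian noise. The step size decays
    with the accumulated energy of the noisy aggregates (an
    AdaGrad-norm-style rule), so it adapts to the combined stochastic
    and channel noise level without knowing it in advance.
    """
    accum = 0.0
    for _ in range(rounds):
        grads = [local_gradient(w, X, y) for X, y in devices]
        # Over-the-air aggregation: superposition plus channel noise.
        agg = np.mean(grads, axis=0) + noise_std * rng.standard_normal(w.shape)
        # Step size adapts to the observed (noisy) gradient energy.
        accum += float(np.sum(agg ** 2))
        eta = eta0 / np.sqrt(1.0 + accum)
        w = w - eta * agg
    return w

# Toy run: 5 devices with heterogeneous linear-regression data.
rng = np.random.default_rng(0)
w_true = rng.standard_normal(10)
devices = [(X := rng.standard_normal((50, 10)),
            X @ w_true + 0.05 * rng.standard_normal(50)) for _ in range(5)]
w_hat = aircomp_fedsgd(devices, np.zeros(10), rng)
print("estimation error:", np.linalg.norm(w_hat - w_true))
```

The design point is that the step size is computed from the noisy aggregates the server actually observes, so neither the server nor the devices need to know the channel noise power in advance.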

📝 Abstract
In federated learning, communication cost can be significantly reduced by transmitting the information over the air through physical channels. In this paper, we propose a new class of adaptive federated stochastic gradient descent (SGD) algorithms that can be implemented over physical channels, taking into account both channel noise and hardware constraints. We establish theoretical guarantees for the proposed algorithms, demonstrating convergence rates that are adaptive to the stochastic gradient noise level. We also demonstrate the practical effectiveness of our algorithms through simulation studies with deep learning models.
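For context, over-the-air aggregation is commonly modeled as below; this standard formulation is an illustrative assumption, since the abstract does not spell out the paper's exact channel or hardware model.

```latex
% Common AirComp aggregation model (illustrative assumption).
% n devices transmit their local gradients g_i^t in the same slot;
% the server observes the superimposed, noise-corrupted sum:
\[
  y^t = \sum_{i=1}^{n} h_i^t \, g_i^t + z^t,
  \qquad z^t \sim \mathcal{N}\!\left(0, \sigma_c^2 I\right),
\]
% where h_i^t is device i's channel coefficient and z^t is additive
% channel noise. One transmission slot thus yields a noisy gradient
% aggregate, which is what makes noise-robust step sizes necessary.
```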
Problem

Research questions and friction points this paper is trying to address.

Develops adaptive federated SGD algorithms for physical channels
Addresses channel noise and hardware constraints in federated learning
Establishes convergence guarantees adaptive to gradient noise levels
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive federated SGD over physical channels
Algorithms handle channel noise and constraints
Convergence rates adapt to gradient noise (see the illustrative bound below)
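To unpack what "adaptive to gradient noise" usually means in this literature, the bound below shows the typical shape of such a guarantee; it is an illustrative form from adaptive-SGD theory (e.g., AdaGrad-norm analyses), not the paper's stated theorem.

```latex
% Typical adaptive convergence guarantee (illustrative assumption:
% L-smooth objective, gradient-noise variance sigma^2, T rounds).
\[
  \min_{t \le T} \; \mathbb{E}\,\bigl\|\nabla f(w^t)\bigr\|^2
  \;=\; O\!\left( \frac{1}{T} + \frac{\sigma}{\sqrt{T}} \right)
\]
% With no tuning to sigma, the same algorithm recovers the fast
% O(1/T) rate when the noise vanishes and the standard O(sigma/sqrt(T))
% rate otherwise; "adaptive" refers to this automatic interpolation.
```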
👥 Authors
Rui Zhang (Department of Electrical Engineering, SUNY University at Buffalo)
Wenlong Mou (University of Toronto): machine learning, statistics, optimization, applied probability