Lightweight Federated Learning over Wireless Edge Networks

📅 2025-07-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high communication overhead, excessive energy consumption, and unreliable convergence of federated learning (FL) deployed over wireless edge networks, this paper proposes a lightweight joint optimization framework. It establishes a closed-form expression for the convergence gap that unifies transmit power control, model pruning, and gradient quantization, making it the first work to jointly optimize these three strategies under dual constraints of latency and energy. Bayesian optimization is further introduced to enable efficient hyperparameter tuning. Extensive experiments on real-world datasets show that the proposed method reduces communication load by up to 68% and client energy consumption by up to 52% compared with state-of-the-art approaches, while incurring less than 1.2% accuracy degradation. The framework significantly improves the training efficiency and practical feasibility of FL in resource-constrained wireless edge environments.

📝 Abstract
With the exponential growth of smart devices connected to wireless networks, data production is increasing rapidly, requiring machine learning (ML) techniques to unlock its value. However, the centralized ML paradigm raises concerns over communication overhead and privacy. Federated learning (FL) offers an alternative at the network edge, but practical deployment in wireless networks remains challenging. This paper proposes a lightweight FL (LTFL) framework integrating wireless transmission power control, model pruning, and gradient quantization. We derive a closed-form expression of the FL convergence gap, considering transmission error, model pruning error, and gradient quantization error. Based on these insights, we formulate an optimization problem to minimize the convergence gap while meeting delay and energy constraints. To solve the non-convex problem efficiently, we derive closed-form solutions for the optimal model pruning ratio and gradient quantization level, and employ Bayesian optimization for transmission power control. Extensive experiments on real-world datasets show that LTFL outperforms state-of-the-art schemes.
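The model pruning and gradient quantization operations described in the abstract can be sketched as follows. This is an illustrative stand-in only — magnitude-based pruning and QSGD-style uniform stochastic quantization — not the paper's closed-form formulation; the function names and parameters are assumptions for the sketch.

```python
import random

def prune(weights, ratio):
    """Magnitude pruning: zero out the smallest `ratio` fraction of weights."""
    k = int(len(weights) * ratio)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    pruned, zeroed = [], 0
    for w in weights:
        if abs(w) <= threshold and zeroed < k:
            pruned.append(0.0)  # drop this parameter from the uplink payload
            zeroed += 1
        else:
            pruned.append(w)
    return pruned

def quantize(gradients, levels):
    """Uniform stochastic quantization to `levels` levels (QSGD-style).

    Each magnitude is scaled into [0, levels] and rounded up or down at
    random so the quantizer is unbiased in expectation.
    """
    norm = max(abs(g) for g in gradients) or 1.0
    out = []
    for g in gradients:
        scaled = abs(g) / norm * levels
        low = int(scaled)
        q = low + (1 if random.random() < scaled - low else 0)
        out.append((1 if g >= 0 else -1) * q * norm / levels)
    return out
```

In the paper's setting, the pruning ratio and the number of quantization levels are the quantities solved for in closed form; here they are just free arguments.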
Problem

Research questions and friction points this paper is trying to address.

Reducing communication overhead in federated learning at wireless edge networks
Minimizing FL convergence gap with transmission and quantization errors
Optimizing model pruning and power control under resource constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lightweight FL with power control
Model pruning and gradient quantization
Bayesian optimization for power control
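As a rough illustration of how the three strategies above fit together, the sketch below searches over candidate transmit powers subject to delay and energy budgets and picks the one minimizing a convergence-gap surrogate. The paper uses Bayesian optimization for this outer search and closed-form inner solutions; here a plain feasibility-filtered grid search stands in, and `surrogate_gap` and `feasible` are invented placeholder models, not the paper's derived expressions.

```python
def surrogate_gap(power, prune_ratio, quant_levels):
    """Placeholder convergence-gap model: the gap shrinks with transmit power
    and quantization resolution and grows with the pruning ratio."""
    return prune_ratio + 1.0 / quant_levels + 0.1 / power

def feasible(power, delay_budget=1.0, energy_budget=1.0):
    """Placeholder link model: higher power lowers delay but raises energy."""
    delay = 0.5 / power
    energy = 0.3 * power
    return delay <= delay_budget and energy <= energy_budget

def best_power(candidates, prune_ratio=0.3, quant_levels=8):
    """Grid-search stand-in for the paper's Bayesian-optimized power control."""
    feas = [p for p in candidates if feasible(p)]
    return min(feas, key=lambda p: surrogate_gap(p, prune_ratio, quant_levels))
```

With candidates spanning low to high power, infeasible points are filtered first (too slow at low power, too energy-hungry at high power), and the gap surrogate then selects among the rest.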
Authors

Xiangwang Hou
Department of EE, Tsinghua University
Jingjing Wang
Professor, School of Cyber Science and Technology, Beihang University
Jun Du
Department of Electronic Engineering, Tsinghua University, Beijing, 100084, China
Chunxiao Jiang
Tsinghua Space Center, Tsinghua University, Beijing, 100084, China
Yong Ren
Institute of Automation, Chinese Academy of Sciences
Dusit Niyato
College of Computing and Data Science, Nanyang Technological University, Singapore 639798