FedSAUC: A Similarity-Aware Update Control for Communication-Efficient Federated Learning in Edge Computing

📅 2021-11-17
🏛️ International Conference on Mobile Computing and Ubiquitous Networking
📈 Citations: 6
Influential: 0
🤖 AI Summary
To address the high communication overhead and energy consumption in federated learning (FL) under edge computing constraints—particularly limited battery life and bandwidth—this paper proposes a similarity-aware update control mechanism. The server clusters devices based on model parameter similarity using K-means or DBSCAN, and selects only one representative device per cluster for global model aggregation. Crucially, this work is the first to incorporate user-behavior-informed model similarity into FL update scheduling, enabling sparse communication without accuracy loss. Experiments on a Raspberry Pi–Android prototype platform and multiple benchmark datasets demonstrate a 40–65% reduction in communication volume and substantial decreases in client-side energy consumption. Long-term evaluation shows no statistically significant difference in test accuracy compared to FedAvg, with fluctuations under 0.3%.
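The core mechanism is straightforward to sketch: flatten each client's model parameters into a vector, cluster the vectors (the paper uses K-means or DBSCAN), and pick one representative per cluster, e.g. the client nearest its cluster centroid. The sketch below uses a minimal hand-rolled K-means; the function name and representative-selection rule are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def select_representatives(client_params, k, iters=20, seed=0):
    """Cluster flattened client model vectors with a simple k-means and
    return one representative client index per cluster (the client whose
    parameters lie closest to the cluster centroid).
    Illustrative sketch only, not the paper's code."""
    X = np.asarray(client_params, dtype=float)
    rng = np.random.default_rng(seed)
    # initialize centroids from k distinct clients
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each client to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centroids (skip clusters that emptied)
        for c in range(k):
            members = X[labels == c]
            if len(members):
                centroids[c] = members.mean(axis=0)
    # final assignment, then pick the member closest to each centroid
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    reps = []
    for c in range(k):
        idx = np.where(labels == c)[0]
        if len(idx):
            reps.append(int(idx[dists[idx, c].argmin()]))
    return reps
```

Only the returned clients upload their models in a given round, which is where the bandwidth and energy savings come from.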

📝 Abstract
Federated learning is a distributed machine learning framework for collaboratively training a global model without uploading privacy-sensitive data to a centralized server. Usually, this framework is applied to edge devices such as smartphones, wearable devices, and Internet of Things (IoT) devices, which closely collect information from users. However, these devices are mostly battery-powered, and the update procedure of federated learning constantly consumes battery power and transmission bandwidth. In this work, we propose an update control for federated learning, FedSAUC, that considers the similarity of users' behaviors (models). At the server side, we exploit clustering algorithms to group devices with similar models and then select representatives of each cluster to upload their information for training. We also implemented a testbed prototype on edge devices to validate the performance. The experimental results show that this update control does not affect the training accuracy in the long run.
Problem

Research questions and friction points this paper is trying to address.

How to reduce battery and bandwidth consumption caused by frequent model updates in federated learning
How to group edge devices by model similarity so that fewer devices need to upload per round
Whether sparser, representative-based updates can maintain training accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Similarity-aware update control for federated learning
Clustering algorithms group similar edge devices
Representative devices update to save resources
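Once representatives are chosen, the server side reduces to a FedAvg-style weighted average taken over the representatives' models only, rather than over all clients. The helper below is a hypothetical sketch of that aggregation step (the function name and the use of local dataset sizes as weights are assumptions; the paper does not publish this code).

```python
import numpy as np

def aggregate_representatives(rep_params, rep_sizes):
    """FedAvg-style weighted average over representative clients only.

    rep_params: list of flattened model parameter vectors (one per rep)
    rep_sizes:  local dataset size of each representative, used as weight
    Hypothetical helper illustrating the update control, not the paper's code."""
    W = np.asarray(rep_params, dtype=float)
    n = np.asarray(rep_sizes, dtype=float)
    # normalize sizes into mixing weights, then take the weighted sum
    return (W * (n / n.sum())[:, None]).sum(axis=0)
```

Because each representative stands in for a whole cluster of similar models, the averaged result approximates what full participation would produce while most clients stay silent that round.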
Ming-Lun Lee
Dept. of Computer Science and Engineering, Yuan Ze University, Taoyuan, Taiwan
Han-Chang Chou
Dept. of Computer Science and Engineering, Yuan Ze University, Taoyuan, Taiwan
Yan-Ann Chen
Dept. of Computer Science and Engineering, Yuan Ze University, Taoyuan, Taiwan
Pervasive Intelligence · Internet of Things · Artificial Intelligence of Things · Cyber-Physical System