Communication-Efficient Federated Learning with Adaptive Number of Participants

📅 2025-08-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high communication overhead and low efficiency caused by dynamic client heterogeneity in federated learning, this paper proposes ISP (Intelligent Selection of Participants), the first framework that models the number of participating clients per round as a learnable, dynamically optimized variable, departing from conventional fixed or heuristic selection paradigms. ISP adaptively determines the client sample size based on gradient similarity and local update stability, and integrates seamlessly with mainstream algorithms (e.g., FedAvg, FedProx) and gradient compression techniques. Experiments on vision Transformers and real-world ECG classification tasks show that ISP reduces communication cost by up to 30% across diverse non-IID and dynamic-client-arrival settings, without compromising final model accuracy. The core contribution is establishing "client count as an optimization variable" as a novel paradigm, accompanied by a lightweight, plug-and-play adaptive solution.
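The summary does not spell out ISP's exact update rule, but the idea of treating the per-round client count as an adaptive variable driven by gradient similarity can be sketched as follows. All names, the pairwise cosine-similarity signal, and the grow/shrink heuristic below are assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two flattened gradient vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def adapt_num_clients(client_grads, n_prev, n_min=2, n_max=50, threshold=0.5):
    """Hypothetical heuristic: if this round's client gradients already agree
    (high mean pairwise cosine similarity), fewer clients may suffice next
    round; if they disagree, sample more to stabilize the aggregate."""
    sims = [cosine_similarity(g1, g2)
            for i, g1 in enumerate(client_grads)
            for g2 in client_grads[i + 1:]]
    mean_sim = float(np.mean(sims)) if sims else 1.0
    if mean_sim > threshold:
        return max(n_min, n_prev - 1)  # stable updates: shrink participation
    return min(n_max, n_prev + 1)      # noisy updates: grow participation

# Toy usage: nearly identical client gradients -> client count shrinks.
rng = np.random.default_rng(0)
g = rng.normal(size=10)
grads = [g + 0.01 * rng.normal(size=10) for _ in range(5)]
print(adapt_num_clients(grads, n_prev=10))  # -> 9
```

In a real FL loop this decision would sit on the server between aggregation steps, so the extra cost is a few vector products per round, which is consistent with the summary's "lightweight, plug-and-play" framing.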

📝 Abstract
Rapid scaling of deep learning models has enabled performance gains across domains, yet it has also introduced several challenges. Federated Learning (FL) has emerged as a promising framework to address these concerns by enabling decentralized training. Nevertheless, communication efficiency remains a key bottleneck in FL, particularly under heterogeneous and dynamic client participation. Existing methods, such as FedAvg and FedProx, as well as client selection strategies, attempt to mitigate communication costs. However, the problem of choosing the number of clients in a training round remains largely underexplored. We introduce Intelligent Selection of Participants (ISP), an adaptive mechanism that dynamically determines the optimal number of clients per round to enhance communication efficiency without compromising model accuracy. We validate the effectiveness of ISP across diverse setups, including vision transformers, real-world ECG classification, and training with gradient compression. Our results show consistent communication savings of up to 30% without degrading final model quality. Applying ISP to different real-world ECG classification setups highlights the selection of the number of clients as a distinct task in federated learning.
Problem

Research questions and friction points this paper is trying to address.

Optimizing client participation in federated learning rounds
Reducing communication costs without sacrificing model accuracy
Addressing dynamic and heterogeneous client participation challenges
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive client selection mechanism
Dynamic optimal participant determination
Communication efficiency without accuracy loss
Authors

Sergey Skorik — Ivannikov Institute for System Programming, Moscow, Russia
Vladislav Dorofeev — Ivannikov Institute for System Programming, Moscow, Russia
Gleb Molodtsov — Researcher
Aram Avetisyan — Ivannikov Institute for System Programming, Moscow, Russia
Dmitry Bylinkin — Ivannikov Institute for System Programming, Moscow, Russia; Moscow Institute of Physics and Technology, Dolgoprudny, Russia
Daniil Medyakov — Unknown affiliation
Aleksandr Beznosikov — PhD, Basic Research of Artificial Intelligence Lab