Sociodynamics-inspired Adaptive Coalition and Client Selection in Federated Learning

📅 2025-06-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Severe data heterogeneity among clients degrades model performance and convergence in federated learning. To address this, we propose a novel federated orchestration mechanism inspired by social opinion dynamics: first, we model inter-client opinion evolution to enable asymptotic-consensus-driven construction of dynamic, non-overlapping client coalitions; second, we design a variance-aware representative selection framework that, within each coalition, favors the client with minimal expected update variance via a Boltzmann exploration strategy. This is the first work to systematically integrate asymptotic consensus into federated client coordination. Extensive experiments under diverse strong-heterogeneity settings demonstrate that our method significantly improves model accuracy (average gain of +2.1%–5.7%) and accelerates convergence (1.8×–3.2× speedup), consistently outperforming FedAvg, FedProx, and SCAFFOLD.
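The coalition-formation step can be sketched with a bounded-confidence opinion model. The summary does not reproduce the paper's exact dynamics, so the following uses the classical Hegselmann-Krause update as an illustrative stand-in: each client repeatedly averages the opinions of peers within a confidence bound `epsilon`, and clients whose opinions converge to the same limit form one non-overlapping coalition. The function name `hk_coalitions` and its parameters are assumptions for illustration only.

```python
import numpy as np

def hk_coalitions(opinions, epsilon=0.2, steps=50, tol=1e-3):
    """Hegselmann-Krause bounded-confidence dynamics (illustrative
    stand-in for the paper's opinion model, not its actual algorithm).

    Each client averages the opinions of all neighbors within
    `epsilon`; clients whose opinions converge to the same limiting
    value are grouped into one non-overlapping coalition.
    """
    x = np.asarray(opinions, dtype=float).copy()
    for _ in range(steps):
        # Boolean confidence graph: who listens to whom this round.
        near = np.abs(x[:, None] - x[None, :]) <= epsilon
        # Average over each client's confidence neighborhood.
        x = (near @ x) / near.sum(axis=1)
    # Group indices whose limiting opinions coincide (within tol).
    coalitions, assigned = [], np.full(len(x), False)
    for i in range(len(x)):
        if assigned[i]:
            continue
        members = np.where(np.abs(x - x[i]) <= tol)[0]
        assigned[members] = True
        coalitions.append(members.tolist())
    return coalitions
```

Because the confidence graph is recomputed every round, the partition adapts as opinions evolve, matching the "dynamic, non-overlapping coalitions" described above.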

📝 Abstract
Federated Learning (FL) enables privacy-preserving collaborative model training, yet its practical strength is often undermined by client data heterogeneity, which severely degrades model performance. This paper proposes that heterogeneity across clients' data distributions can be effectively addressed by an approach inspired by opinion dynamics over temporal social networks. We introduce Federated Coalition Variance Reduction with Boltzmann Exploration, a variance-reducing selection algorithm in which (1) clients dynamically organize into non-overlapping clusters based on asymptotic agreements, and (2) from each cluster, one client is selected so as to minimize the expected variance of its model update. Our experiments show that in heterogeneous scenarios our algorithm outperforms existing FL algorithms, yielding more accurate results and faster convergence, validating the efficacy of our approach.
Problem

Research questions and friction points this paper is trying to address.

Addressing client data heterogeneity in Federated Learning
Dynamic client clustering based on asymptotic agreements
Minimizing model update variance for faster convergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sociodynamics-inspired adaptive client clustering
Variance-reducing selection via Boltzmann exploration
Non-overlapping clusters for heterogeneous data
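The variance-reducing selection via Boltzmann exploration can be sketched as a softmax over negated per-client update variances: low-variance clients are picked with high probability, while higher-variance clients are still occasionally explored. The function name, the `temperature` parameter, and the use of raw variance estimates as scores are illustrative assumptions; the paper's exact variance estimator is not reproduced here.

```python
import numpy as np

def boltzmann_select(update_variances, temperature=1.0, rng=None):
    """Pick one representative client per coalition.

    Selection probabilities follow a Boltzmann (softmax) distribution
    over the negated update variances, so the minimum-variance client
    is favored without being chosen deterministically.
    """
    rng = rng or np.random.default_rng()
    v = np.asarray(update_variances, dtype=float)
    logits = -v / temperature      # lower variance -> higher score
    logits -= logits.max()         # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return int(rng.choice(len(v), p=probs))
```

Annealing `temperature` toward zero shifts the policy from exploration toward greedily picking the coalition's minimum-variance client.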