🤖 AI Summary
Severe data heterogeneity among clients degrades model performance and convergence in federated learning. To address this, we propose a novel federated orchestration mechanism inspired by social opinion dynamics: first, we model inter-client opinion evolution to enable asymptotic-consensus-driven construction of dynamic, non-overlapping client coalitions; second, we design a variance-aware representative client selection framework that adaptively selects the client with minimal update variance using a Boltzmann exploration strategy. This is the first work to systematically integrate asymptotic consensus into the federated client coordination architecture. Extensive experiments under diverse strong heterogeneity settings demonstrate that our method significantly improves model accuracy (average gain of +2.1%–5.7%) and accelerates convergence (1.8×–3.2× speedup), consistently outperforming FedAvg, FedProx, and SCAFFOLD.
📝 Abstract
Federated Learning (FL) enables privacy-preserving collaborative model training, yet its practical strength is often undermined by client data heterogeneity, which severely degrades model performance. This paper proposes that data heterogeneity across clients' distributions can be effectively addressed by adopting an approach inspired by opinion dynamics over temporal social networks. We introduce shortname (Federated Coalition Variance Reduction with Boltzmann Exploration), a variance-reducing selection algorithm in which (1) clients dynamically organize into non-overlapping clusters based on asymptotic agreements, and (2) from each cluster, one client is selected to minimize the expected variance of its model update. Our experiments show that in heterogeneous scenarios our algorithm outperforms existing FL algorithms, yielding more accurate results and faster convergence, validating the efficacy of our approach.