Interaction-Aware Gaussian Weighting for Clustered Federated Learning

📅 2025-02-05
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address performance degradation in federated learning (FL) caused by data heterogeneity and class imbalance, this paper proposes a client-interaction-aware clustering FL framework. Our method introduces a novel Gaussian-weighted interaction modeling mechanism that formalizes client-wise empirical loss correlations as a Gaussian reward function, jointly optimized with Wasserstein distance to measure class-level distribution consistency—thereby establishing a data-distribution-driven dynamic clustering criterion. We further design a distributed clustering optimization algorithm and a tailored federated aggregation strategy. Extensive experiments across multiple benchmark datasets demonstrate that our approach significantly improves both clustering purity and classification accuracy: it achieves average accuracy gains of 3.2–5.7% over state-of-the-art clustering-based FL methods. The framework effectively balances personalization, robustness to statistical heterogeneity, and decentralized training requirements, without relying on centralized coordination or ground-truth label information.

Technology Category

Application Category

📝 Abstract
Federated Learning (FL) emerged as a decentralized paradigm to train models while preserving privacy. However, conventional FL struggles with data heterogeneity and class imbalance, which degrade model performance. Clustered FL balances personalization and decentralized training by grouping clients with analogous data distributions, enabling improved accuracy while adhering to privacy constraints. This approach effectively mitigates the adverse impact of heterogeneity in FL. In this work, we propose a novel clustered FL method, FedGWC (Federated Gaussian Weighting Clustering), which groups clients based on their data distribution, allowing training of a more robust and personalized model on the identified clusters. FedGWC identifies homogeneous clusters by transforming individual empirical losses to model client interactions with a Gaussian reward mechanism. Additionally, we introduce the Wasserstein Adjusted Score, a new clustering metric for FL to evaluate cluster cohesion with respect to the individual class distribution. Our experiments on benchmark datasets show that FedGWC outperforms existing FL algorithms in cluster quality and classification accuracy, validating the efficacy of our approach.
Problem

Research questions and friction points this paper is trying to address.

Addresses data heterogeneity in Federated Learning
Improves model performance with class imbalance
Enhances cluster quality and classification accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian reward mechanism clustering
Wasserstein Adjusted Score metric
Improved cluster quality accuracy
🔎 Similar Papers
No similar papers found.