🤖 AI Summary
In decentralized federated learning, heterogeneous client data distributions lead to insufficient personalization and high communication overhead. Method: This paper proposes the first soft-clustering-based framework for decentralized personalized federated learning. Distributed soft clustering establishes cross-client consensus on data clusters, so that personalized model aggregation strategies can be optimized collaboratively in a serverless peer-to-peer network; the framework also supports selective model updates and guarantees consensus convergence even under low connectivity. Contribution/Results: On real-world datasets, the method achieves significant accuracy gains over existing decentralized personalized approaches, especially in sparse-connectivity scenarios, while reducing communication costs by 30%–50%. The framework thus jointly improves personalization performance and communication efficiency.
📝 Abstract
Federated learning has recently gained popularity as a framework for distributed clients to collaboratively train a machine learning model using local data. While traditional federated learning relies on a central server for model aggregation, recent advancements adopt a decentralized framework, enabling direct model exchange between clients and eliminating the single point of failure. However, existing decentralized frameworks often assume all clients train a shared model. Personalizing each client's model can enhance performance, especially with heterogeneous client data distributions. We propose FedSPD, an efficient personalized federated learning algorithm for the decentralized setting, and show that it learns accurate models even in low-connectivity networks. To provide theoretical guarantees on convergence, we introduce a clustering-based framework that enables consensus on models for distinct data clusters while personalizing to unique mixtures of these clusters at different clients. This flexibility, allowing selective model updates based on data distribution, substantially reduces communication costs compared to prior work on personalized federated learning in decentralized settings. Experimental results on real-world datasets show that FedSPD outperforms multiple decentralized variants of personalized federated learning algorithms, especially in scenarios with low-connectivity networks.
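The core idea described above, consensus on per-cluster models combined with per-client soft mixtures, can be illustrated with a minimal sketch. This is not the authors' FedSPD implementation; all names, the ring topology, and the use of scalars in place of model parameters are illustrative assumptions. It shows cluster models reaching consensus via gossip averaging over a sparsely connected peer-to-peer network, while each client's personalized model remains a distinct soft mixture of the shared cluster models.

```python
# Hypothetical sketch of clustering-based decentralized personalization
# (assumed structure, not the paper's actual algorithm): K cluster models
# reach consensus via gossip averaging; each client personalizes by
# mixing them with its own soft cluster-membership weights.
import random

K = 3   # number of data clusters (assumed)
N = 8   # number of clients on a ring topology (low connectivity)
random.seed(0)

# Each client starts with its own noisy estimate of each cluster model;
# scalars stand in for full model parameter vectors.
models = [[random.gauss(c, 0.5) for c in range(K)] for _ in range(N)]

# Soft cluster-membership weights per client (its local data mixture).
weights = [[random.random() for _ in range(K)] for _ in range(N)]
weights = [[w / sum(ws) for w in ws] for ws in weights]

def gossip_round(models):
    """One synchronous gossip round: every client averages each cluster
    model with its two ring neighbors. Selective updates would further
    restrict this to clusters the client actually holds data for."""
    new = []
    for i in range(N):
        nbrs = [models[i], models[(i - 1) % N], models[(i + 1) % N]]
        new.append([sum(m[k] for m in nbrs) / 3 for k in range(K)])
    return new

for _ in range(100):
    models = gossip_round(models)

# After many rounds the per-cluster models agree across all clients...
spread = max(abs(models[i][k] - models[j][k])
             for k in range(K) for i in range(N) for j in range(N))

# ...while personalized models still differ, since each client mixes the
# shared cluster models with its own soft weights.
personalized = [sum(w * m for w, m in zip(weights[i], models[i]))
                for i in range(N)]
```

The sketch highlights the separation the abstract describes: consensus is needed only on the small set of cluster models, not on a single shared model or on every client's personalized model, which is what allows selective, lower-cost communication.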