Graph-based Gossiping for Communication Efficiency in Decentralized Federated Learning

📅 2025-06-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Decentralized federated learning suffers from high communication overhead and poor topology adaptability due to full decentralization, while existing approaches lack validation in realistic distributed environments. Method: This paper proposes a graph-structured adaptive gossip communication mechanism. It is the first to jointly integrate minimum spanning tree construction and graph coloring to co-model model parameter size and network latency, enabling dynamic subnet formation and message routing optimization across heterogeneous topologies and device capabilities. Contribution/Results: We deploy the distributed gossip protocol on a real physical network—including router-level configurations. Compared to flooding-based broadcast, our approach reduces bandwidth consumption by approximately 8× and decreases end-to-end transmission time by ~4.4×, significantly improving communication efficiency and system scalability.

Technology Category

Application Category

📝 Abstract
Federated learning has emerged as a privacy-preserving technique for collaborative model training across heterogeneously distributed silos. Yet, its reliance on a single central server introduces potential bottlenecks and risks of single-point failure. Decentralizing the server, often referred to as decentralized learning, addresses this problem by distributing the server role across nodes within the network. One drawback regarding this pure decentralization is it introduces communication inefficiencies, which arise from increased message exchanges in large-scale setups. However, existing proposed solutions often fail to simulate the real-world distributed and decentralized environment in their experiments, leading to unreliable performance evaluations and limited applicability in practice. Recognizing the lack from prior works, this work investigates the correlation between model size and network latency, a critical factor in optimizing decentralized learning communication. We propose a graph-based gossiping mechanism, where specifically, minimum spanning tree and graph coloring are used to optimize network structure and scheduling for efficient communication across various network topologies and message capacities. Our approach configures and manages subnetworks on real physical routers and devices and closely models real-world distributed setups. Experimental results demonstrate that our method significantly improves communication, compatible with different topologies and data sizes, reducing bandwidth and transfer time by up to circa 8 and 4.4 times, respectively, compared to naive flooding broadcasting methods.
Problem

Research questions and friction points this paper is trying to address.

Optimizing communication in decentralized federated learning
Reducing network latency for large model sizes
Improving scalability across diverse network topologies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph-based gossiping for decentralized learning
Minimum spanning tree optimizes network structure
Graph coloring enhances communication scheduling
🔎 Similar Papers
No similar papers found.