🤖 AI Summary
To address the single-point-of-failure vulnerability of federated learning and the high communication overhead and slow consensus of peer-to-peer learning, this paper proposes the Hub-Spoke collaborative learning framework. It introduces a two-tier, graph-topology-based hybrid communication architecture that combines the robustness of centralized aggregation with the scalability of decentralized mixing. A graph-structured local model mixing mechanism removes the global aggregation bottleneck, and the paper provides theoretical convergence guarantees alongside a distributed optimization algorithm. Experiments on CIFAR-10 show that the framework matches the test accuracy of the ELL baseline using only 400 communication edges versus ELL's 1,000, and that under equal communication budgets it delivers significant accuracy gains, fewer training rounds, and faster node consensus. This paradigm establishes a pathway toward efficient, fault-tolerant, and communication-efficient collaborative learning.
📝 Abstract
We introduce the Hubs and Spokes Learning (HSL) framework, a novel paradigm for collaborative machine learning that combines the strengths of Federated Learning (FL) and Peer-to-Peer Learning (P2PL). HSL employs a two-tier communication structure that avoids the single point of failure inherent in FL and outperforms the state-of-the-art P2PL framework, Epidemic Learning Local (ELL). At equal communication budgets (total edges), HSL achieves higher performance than ELL, while at significantly lower communication budgets it can match ELL's performance. For instance, with only 400 edges, HSL reaches the same test accuracy that ELL achieves with 1,000 edges for 100 peers (spokes) on CIFAR-10, demonstrating its suitability for resource-constrained systems. HSL also achieves stronger consensus among nodes after mixing, resulting in improved performance with fewer training rounds. We substantiate these claims through rigorous theoretical analyses and extensive experimental results, showcasing HSL's practicality for large-scale collaborative learning.
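To make the two-tier mixing idea concrete, here is a minimal sketch of one communication round under assumptions the abstract does not specify (the paper's actual topology, weighting, and hub gossip scheme may differ): spokes push their parameter vectors to a few randomly chosen hubs, hubs average what they receive and mix among themselves, and each spoke pulls back the average of its hubs. All names (`hsl_mixing_round`, `hubs_per_spoke`) are hypothetical, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_spokes, n_hubs, dim = 100, 10, 5
spoke_models = rng.normal(size=(n_spokes, dim))  # one parameter vector per spoke

def hsl_mixing_round(spoke_models, n_hubs, hubs_per_spoke=2, rng=rng):
    """One illustrative hub-spoke mixing round (assumed scheme, not the paper's exact one)."""
    n_spokes, dim = spoke_models.shape
    # Step 1: each spoke sends its model to a few randomly chosen hubs.
    assignments = [rng.choice(n_hubs, size=hubs_per_spoke, replace=False)
                   for _ in range(n_spokes)]
    hub_models = np.zeros((n_hubs, dim))
    hub_counts = np.zeros(n_hubs)
    for s, hubs in enumerate(assignments):
        for h in hubs:
            hub_models[h] += spoke_models[s]
            hub_counts[h] += 1
    hub_models /= np.maximum(hub_counts, 1)[:, None]  # per-hub average of received models
    # Step 2: hubs mix among themselves; here a single full-averaging
    # gossip step stands in for whatever inter-hub scheme is actually used.
    hub_models[:] = hub_models.mean(axis=0)
    # Step 3: each spoke pulls back the average of its hubs' models.
    return np.stack([hub_models[hubs].mean(axis=0) for hubs in assignments])

mixed = hsl_mixing_round(spoke_models, n_hubs)
# With full inter-hub averaging, every spoke ends the round with the
# same model, i.e. perfect consensus after a single mixing round.
assert np.allclose(mixed, mixed[0])
```

The communication cost of such a round is the number of spoke-hub edges (`n_spokes * hubs_per_spoke`) plus the hub-hub edges, which is how a framework of this shape can trade edges against consensus quality.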