🤖 AI Summary
To address the straggler problem caused by communication heterogeneity in federated learning (FL), this paper proposes a semi-decentralized coded collaboration framework that requires no prior knowledge of the global network topology. The core innovation is to explicitly exploit wireless channel diversity within a semi-decentralized FL architecture, replacing conventional adaptive topology-aware mechanisms with deterministic coded networking to enable lightweight, robust inter-client collaborative training. Theoretical analysis establishes bounds on both the convergence rate and the communication outage probability. Simulation results demonstrate that, in highly heterogeneous wireless environments, the proposed method significantly improves both convergence speed and training stability, outperforming state-of-the-art baseline schemes.
📝 Abstract
To enhance straggler resilience in federated learning (FL) systems, a semi-decentralized approach has recently been proposed that enables collaboration between clients. Unlike existing semi-decentralized schemes, which adaptively adjust collaboration weights according to the network topology, this letter proposes a deterministic coded network that leverages wireless diversity for semi-decentralized FL without requiring prior information about the entire network. Furthermore, theoretical analyses of the outage probability and the convergence rate of the proposed scheme are provided. Finally, the superiority of the proposed method over benchmark methods is demonstrated through comprehensive simulations.
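The letter's actual outage analysis is not reproduced here, but the intuition behind trading coded redundancy for outage resilience over erasure-prone wireless links can be sketched with a standard binomial outage computation. This is a minimal illustration only: the (n, k) MDS-style erasure code, the i.i.d. link-failure model, and all numbers below are assumptions for exposition, not the paper's scheme.

```python
from math import comb

def outage_probability(n: int, k: int, p_fail: float) -> float:
    """P(fewer than k of n independent links succeed), i.e. the decoding
    outage of an illustrative (n, k) MDS-style code when each link fails
    i.i.d. with probability p_fail (an assumed toy channel model)."""
    p_succ = 1.0 - p_fail
    # Outage = sum over s = 0..k-1 of P(exactly s links succeed).
    return sum(comb(n, s) * p_succ**s * p_fail**(n - s) for s in range(k))

# With per-link failure probability 0.2, a single uncoded link is in
# outage 20% of the time, while a (5, 3) code needs only 3 of 5 links:
uncoded = outage_probability(1, 1, 0.2)  # 0.2
coded = outage_probability(5, 3, 0.2)    # ~0.058
```

The qualitative takeaway matches the letter's motivation: deterministic redundancy across diverse wireless links drives the outage probability down without any client needing knowledge of the full network topology.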