A Unified Convergence Analysis for Semi-Decentralized Learning: Sampled-to-Sampled vs. Sampled-to-All Communication

📅 2025-11-14
🤖 AI Summary
In semi-decentralized federated learning (FL), server-to-client model distribution follows one of two strategies: sampled-to-sampled (S2S), in which the aggregated model is sent only to the clients sampled in the current round, or sampled-to-all (S2A), in which it is broadcast to every client; however, no theoretical comparison or practical guidance exists for selecting between them. Method: We establish a unified convergence analysis framework that systematically characterizes the convergence behaviors of S2S and S2A under heterogeneous data distributions, client sampling rates, aggregation frequencies, and network connectivity. Contribution/Results: Our theoretical and empirical analysis reveals that data heterogeneity is the decisive factor governing strategy selection: S2A significantly accelerates convergence under high heterogeneity, whereas S2S achieves superior efficiency under low heterogeneity. This work provides actionable, theory-grounded criteria for communication-mode selection in real-world semi-decentralized FL systems, thereby filling a critical theoretical gap and offering principled engineering guidance for model distribution strategies.

📝 Abstract
In semi-decentralized federated learning, devices primarily rely on device-to-device communication but occasionally interact with a central server. Periodically, a sampled subset of devices uploads their local models to the server, which computes an aggregate model. The server can then either (i) share this aggregate model only with the sampled clients (sampled-to-sampled, S2S) or (ii) broadcast it to all clients (sampled-to-all, S2A). Despite their practical significance, a rigorous theoretical and empirical comparison of these two strategies remains absent. We address this gap by analyzing S2S and S2A within a unified convergence framework that accounts for key system parameters: sampling rate, server aggregation frequency, and network connectivity. Our results, both analytical and experimental, reveal distinct regimes where one strategy outperforms the other, depending primarily on the degree of data heterogeneity across devices. These insights lead to concrete design guidelines for practical semi-decentralized FL deployments.
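The round structure described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: models are reduced to scalars, the D2D phase is a single neighborhood-averaging gossip step, local gradient updates are omitted, and the names (`run_round`, `neighbors`, `sample_frac`) are placeholders chosen here for clarity.

```python
import random

def run_round(models, neighbors, sample_frac, mode, rng):
    """One semi-decentralized round (toy scalar models).

    1. D2D gossip: each client averages with its neighbors.
    2. A random subset of clients uploads; the server averages their models.
    3. S2S: only the sampled clients receive the aggregate.
       S2A: every client receives the aggregate.
    """
    n = len(models)
    # Step 1: one gossip step (uniform averaging over the closed neighborhood).
    gossiped = [
        (models[i] + sum(models[j] for j in neighbors[i])) / (1 + len(neighbors[i]))
        for i in range(n)
    ]
    # Step 2: sample clients and aggregate their models at the server.
    k = max(1, int(sample_frac * n))
    sampled = rng.sample(range(n), k)
    aggregate = sum(gossiped[i] for i in sampled) / k
    # Step 3: distribute the aggregate according to the chosen mode.
    if mode == "S2A":
        return [aggregate] * n          # broadcast: all clients synchronized
    out = list(gossiped)                # S2S: non-sampled clients keep gossiped models
    for i in sampled:
        out[i] = aggregate
    return out
```

Under S2A the sampled aggregate overwrites every client, so disagreement between clients collapses each aggregation round, which is consistent with the paper's finding that S2A helps most when data heterogeneity keeps local models far apart; under S2S, non-sampled clients drift until gossip or a later sample reaches them.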
Problem

Research questions and friction points this paper is trying to address.

Compares sampled-to-sampled vs sampled-to-all communication strategies
Analyzes convergence under data heterogeneity and system parameters
Provides design guidelines for semi-decentralized federated learning deployments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified convergence framework covering sampling rate, aggregation frequency, and network connectivity
First rigorous theoretical and empirical comparison of S2S and S2A model distribution
Identifies data heterogeneity as the decisive factor for choosing between the two strategies