🤖 AI Summary
To address the high communication overhead, training instability, and weak privacy guarantees of decentralized federated learning (DFL), this paper proposes a sparsity-aware modeling framework with partial neighbor selection, introducing SDFL, the first decentralized learning framework formulated under an explicit sparsity constraint. Methodologically, it integrates three key components: (1) the CEPS optimizer, which guarantees convergence and satisfies ε-differential privacy; (2) one-bit compressed sensing for ultra-low-bandwidth communication; and (3) graph-topology-aware distributed stochastic gradient descent coupled with sparse optimization. Theoretically, SDFL is proven to converge in non-convex settings while strictly adhering to a pre-specified privacy budget. Empirically, on multiple benchmark datasets, SDFL reduces both communication and computational costs by up to an order of magnitude while preserving model accuracy and training robustness.
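To make the third component concrete, the sketch below shows one generic round of decentralized SGD with partial neighbor selection under a hard sparsity constraint. This is an illustrative stand-in, not the paper's CEPS update rule: the mixing weights, neighbor-sampling scheme, and sparse projection (`hard_threshold` here) are assumptions for the example.

```python
import numpy as np

def hard_threshold(w, k):
    """Keep only the k largest-magnitude entries (the sparsity constraint)."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-k:]
    out[idx] = w[idx]
    return out

def sdfl_round(params, neighbours, grads, k, lr=0.1, n_select=1, rng=None):
    """One illustrative round of sparse decentralized SGD.

    Generic sketch only, not CEPS. `params` maps node id -> weight vector,
    `neighbours` maps node id -> list of graph neighbours, and `grads`
    maps node id -> that node's local stochastic gradient.
    """
    rng = np.random.default_rng() if rng is None else rng
    new_params = {}
    for i, w in params.items():
        # Partial neighbor selection: mix with a random subset of neighbours.
        chosen = rng.choice(neighbours[i],
                            size=min(n_select, len(neighbours[i])),
                            replace=False)
        mixed = np.mean([w] + [params[j] for j in chosen], axis=0)
        # Local SGD step, then project back onto the k-sparse set.
        new_params[i] = hard_threshold(mixed - lr * grads[i], k)
    return new_params

# Toy usage: three nodes on a fully connected graph, 10-dim models, k = 3.
rng = np.random.default_rng(1)
d, k = 10, 3
params = {i: rng.standard_normal(d) for i in range(3)}
neighbours = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
grads = {i: rng.standard_normal(d) for i in range(3)}
params = sdfl_round(params, neighbours, grads, k, rng=rng)
```

After each round, every node's model satisfies the sparsity budget (at most `k` nonzeros), which is what makes the one-bit compression of the shared updates feasible.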
📝 Abstract
Decentralized Federated Learning (DFL) enables collaborative model training without a central server but faces challenges in efficiency, stability, and trustworthiness due to communication and computational limitations among distributed nodes. To address these critical issues, we introduce a sparsity constraint on the shared model, leading to Sparse DFL (SDFL), and propose a novel algorithm, CEPS. The sparsity constraint facilitates the use of one-bit compressive sensing to exchange one-bit messages between partially selected neighbour nodes at specific steps, thereby significantly improving communication efficiency. Moreover, we integrate differential privacy into the algorithm to ensure privacy preservation and bolster the trustworthiness of the learning process. Furthermore, CEPS is underpinned by theoretical guarantees regarding both convergence and privacy. Numerical experiments validate the effectiveness of the proposed algorithm in improving communication and computation efficiency while maintaining a high level of trustworthiness.
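The one-bit communication idea can be sketched as follows. A sparse update is compressed into sign bits of random measurements, with Laplace noise added before quantization as a stand-in for a differential-privacy mechanism; the receiver recovers the update's direction by back-projection and hard thresholding. Everything here (the Laplace noise placement, the simple decoder, all parameter values) is an assumption for illustration, not the paper's CEPS construction.

```python
import numpy as np

def one_bit_compress(x, A, epsilon=1.0, sensitivity=1.0, rng=None):
    """Sign-quantize random measurements of a sparse update x.

    Laplace noise with scale = sensitivity/epsilon is added before
    quantization; this is a generic DP-style mechanism for illustration.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.laplace(scale=sensitivity / epsilon, size=A.shape[0])
    return np.sign(A @ x + noise)  # one bit per measurement

def one_bit_decode(bits, A, k):
    """Back-project A^T y, then keep the k largest-magnitude entries."""
    est = A.T @ bits
    support = np.argsort(np.abs(est))[-k:]
    xhat = np.zeros(A.shape[1])
    xhat[support] = est[support]
    norm = np.linalg.norm(xhat)
    # One-bit measurements lose scale, so only the direction is recovered.
    return xhat / norm if norm > 0 else xhat

# Toy usage: a 5-sparse unit update in R^200, sent as 100 bits.
rng = np.random.default_rng(0)
d, m, k = 200, 100, 5
x = np.zeros(d)
x[rng.choice(d, k, replace=False)] = rng.standard_normal(k)
x /= np.linalg.norm(x)
A = rng.standard_normal((m, d)) / np.sqrt(m)
bits = one_bit_compress(x, A, epsilon=5.0, rng=rng)
xhat = one_bit_decode(bits, A, k)
cosine = float(x @ xhat)  # direction alignment between x and its estimate
```

Transmitting 100 bits instead of 200 floats is what makes the "ultra-low-bandwidth" claim plausible; the price is that only the direction of the sparse update is recovered, and the DP noise further degrades the sign pattern as ε shrinks.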