Whisper D-SGD: Correlated Noise Across Agents for Differentially Private Decentralized Learning

📅 2025-01-24
🤖 AI Summary
In decentralized learning, local differential privacy (LDP) severely degrades model utility compared to centralized differential privacy (CDP). To address this, we propose a cross-agent correlated noise mechanism based on covariance modeling. Our approach preserves data locality—no raw data is shared—and introduces, for the first time, a network topology-aware covariance optimization framework to enable global noise cancellation. By integrating hybrid weight design with graph signal processing techniques, we unify and generalize several state-of-the-art LDP schemes. Theoretical analysis and extensive experiments demonstrate that, under identical privacy budgets, our method significantly improves model accuracy over existing LDP approaches. It achieves superior noise cancellation compared to prior pairwise correlation mechanisms and substantially narrows the performance gap between CDP and LDP—bridging up to 70% of the utility loss in benchmark tasks.
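The CDP-LDP utility gap the summary refers to can be illustrated with a standard back-of-the-envelope calculation (this sketch is not the paper's mechanism; `sigma` stands for the per-agent noise scale required by a fixed privacy budget):

```python
import math

def aggregate_noise_std(n_agents: int, sigma: float, mode: str) -> float:
    """Standard deviation of the noise left on the averaged model.

    CDP: a trusted aggregator adds one noise draw calibrated to the
    average, whose sensitivity shrinks by 1/n, leaving sigma / n.
    LDP: each agent independently adds noise of scale sigma before
    averaging; the n draws average to scale sigma / sqrt(n).
    """
    if mode == "cdp":
        return sigma / n_agents
    if mode == "ldp":
        return sigma / math.sqrt(n_agents)
    raise ValueError(f"unknown mode: {mode}")

# With 100 agents, LDP leaves 10x more residual noise than CDP
ratio = aggregate_noise_std(100, 1.0, "ldp") / aggregate_noise_std(100, 1.0, "cdp")
```

Correlated noise mechanisms aim to close exactly this sqrt(n) factor by making the per-agent draws cancel during averaging rather than accumulate.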

📝 Abstract
Decentralized learning enables distributed agents to train a shared machine learning model through local computation and peer-to-peer communication. Although each agent retains its dataset locally, the communication of local models can still expose private information to adversaries. To mitigate these threats, local differential privacy (LDP) injects independent noise per agent, but it suffers a larger utility gap than central differential privacy (CDP). We introduce Whisper D-SGD, a novel covariance-based approach that generates correlated privacy noise across agents, unifying several state-of-the-art methods as special cases. By leveraging network topology and mixing weights, Whisper D-SGD optimizes the noise covariance to achieve network-wide noise cancellation. Experimental results show that Whisper D-SGD cancels more noise than existing pairwise-correlation schemes, substantially narrowing the CDP-LDP gap and improving model performance under the same privacy guarantees.
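To illustrate the noise-cancellation idea, one minimal correlated design draws per-agent Gaussian noise jointly on the zero-sum subspace, so the noise vanishes from the network average while each agent's shared model still carries noise. This is a hypothetical sketch assuming ideal joint sampling, not Whisper D-SGD's topology-aware optimized covariance:

```python
import numpy as np

def zero_sum_correlated_noise(n_agents: int, dim: int, sigma: float, rng) -> np.ndarray:
    """Draw jointly Gaussian noise whose sum over agents is exactly zero.

    The cross-agent covariance is sigma^2 * (I - J/n): each agent's
    marginal noise has variance sigma^2 * (1 - 1/n), close to independent
    LDP noise, yet averaging over the network cancels it completely.
    """
    z = rng.normal(0.0, sigma, size=(n_agents, dim))
    # Subtracting the across-agent mean projects onto the zero-sum subspace
    return z - z.mean(axis=0, keepdims=True)

rng = np.random.default_rng(0)
noise = zero_sum_correlated_noise(8, 4, 1.0, rng)
# noise.sum(axis=0) is (numerically) the zero vector
```

In a real decentralized setting agents cannot sample jointly without coordination, which is why the paper instead optimizes the noise covariance against the network topology and mixing weights so that cancellation emerges through gossip averaging.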
Problem

Research questions and friction points this paper is trying to address.

Decentralized Learning
Local Differential Privacy (LDP)
Model Performance Gap
Innovation

Methods, ideas, or system contributions that make the work stand out.

Whisper D-SGD
Local Differential Privacy (LDP)
Decentralized Learning