🤖 AI Summary
This work addresses the vulnerability of node communications in decentralized networks to inference by curious neighbors, a challenge exacerbated by the absence of general, low-overhead, protocol-level privacy-preserving mechanisms. To this end, the authors propose DPPS, a lightweight differentially private protocol that, for the first time, dynamically estimates sensitivity at the protocol layer by having each node broadcast a single scalar per round. Integrated with parameter partitioning and partial communication, DPPS yields PartPSP, a non-convex optimization algorithm that enables plug-and-play privacy protection. The method achieves a superior utility-privacy trade-off while guaranteeing differential privacy. Theoretical analysis establishes the convergence of PartPSP under non-convex objectives, and empirical evaluations demonstrate that it significantly outperforms existing decentralized private optimization algorithms under identical privacy budgets.
📄 Abstract
In decentralized networks, nodes cannot ensure that their shared information will be securely preserved by their neighbors, making privacy vulnerable to inference by curious nodes. Adding calibrated random noise before communication to satisfy differential privacy offers a proven defense; however, most existing methods are tailored to specific downstream tasks and lack a general, protocol-level privacy-preserving solution. To bridge this gap, we propose Differentially Private Perturbed Push-Sum (DPPS), a lightweight differential privacy protocol for decentralized communication. Since protocol-level differential privacy introduces the unique challenge of obtaining the sensitivity for each communication round, DPPS introduces a novel sensitivity estimation mechanism that requires each node to compute and broadcast only one scalar per round, enabling rigorous differential privacy guarantees. This design allows DPPS to serve as a plug-and-play, low-cost privacy-preserving solution for downstream applications built on it. To provide a concrete instantiation of DPPS and better balance the privacy-utility trade-off, we design PartPSP, a privacy-preserving decentralized algorithm for non-convex optimization that integrates a partial communication mechanism. By partitioning model parameters into local and shared components and applying DPPS only to the shared parameters, PartPSP reduces the dimensionality of consensus data, thereby lowering the magnitude of injected noise and improving optimization performance. We theoretically prove that PartPSP converges under non-convex objectives and that, with partial communication, it achieves better optimization performance under the same privacy budget. Experimental results validate the effectiveness of DPPS's privacy protection and demonstrate that PartPSP outperforms existing privacy-preserving decentralized optimization algorithms.
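To make the protocol concrete, the following is a minimal illustrative sketch of one perturbed push-sum round in the spirit of the abstract: each node broadcasts a single scalar (here, its local vector norm) so that all nodes agree on a per-round sensitivity bound, then perturbs every transmitted message with noise calibrated to that bound. This is not the authors' actual DPPS protocol; the function name `dpps_round`, the choice of the max norm as the sensitivity estimate, and the Laplace-style noise calibration are all assumptions made for illustration.

```python
import numpy as np

def dpps_round(values, weights, adjacency, epsilon, rng):
    """One illustrative perturbed push-sum round (hypothetical sketch).

    values[i]    : node i's current vector
    weights[i]   : node i's push-sum weight
    adjacency[i] : list of out-neighbors of node i (including i itself)
    epsilon      : per-round privacy budget (assumed calibration)
    """
    n = len(values)
    # Step 1 (assumed): each node broadcasts one scalar -- the norm of its
    # current vector -- and the round's sensitivity bound is their maximum.
    sensitivity = max(np.linalg.norm(v) for v in values)
    noise_scale = sensitivity / epsilon  # Laplace-style calibration (assumption)

    new_values = [np.zeros_like(values[0]) for _ in range(n)]
    new_weights = [0.0] * n
    for i in range(n):
        out = adjacency[i]
        share_v = values[i] / len(out)   # push-sum: split value across out-links
        share_w = weights[i] / len(out)  # and split the weight the same way
        for j in out:
            # Step 2: perturb every transmitted message with calibrated noise,
            # so neighbors only ever observe noisy shares.
            noisy = share_v + rng.laplace(0.0, noise_scale, size=share_v.shape)
            new_values[j] = new_values[j] + noisy
            new_weights[j] += share_w
    return new_values, new_weights
```

As in standard push-sum, node `i`'s running estimate of the network average is the ratio `new_values[i] / new_weights[i]`; the weight channel is left unperturbed here because, in this sketch, only the value vectors are treated as private.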