Differentially Private Decentralized Dataset Synthesis Through Randomized Mixing with Correlated Noise

📅 2025-09-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In decentralized data settings, differentially private synthetic data generation suffers from a privacy–utility trade-off—particularly for clients with small local datasets, where high noise injection severely degrades utility. To address this, we propose an enhanced federated DP-CDA framework. Its core innovation is the CAPE protocol, enabling cross-client correlated-noise coordination such that noise partially cancels during server-side aggregation, thereby simultaneously preserving individual privacy guarantees and improving synthetic data quality. The framework integrates class-center-aware random mixing, controllable Gaussian noise injection, and rigorous (ε, δ)-differential privacy analysis. Experiments on MNIST and FashionMNIST demonstrate that our method achieves synthetic data utility approaching centralized non-federated baselines while satisfying (ε, δ)-DP—substantially outperforming standard federated DP synthetic data approaches.

Technology Category

Application Category

📝 Abstract
In this work, we explore differentially private synthetic data generation in a decentralized-data setting by building on the recently proposed Differentially Private Class-Centric Data Aggregation (DP-CDA). DP-CDA synthesizes data in a centralized setting by mixing multiple randomly-selected samples from the same class and injecting carefully calibrated Gaussian noise, ensuring (ε, δ)-differential privacy. When deployed in a decentralized or federated setting, where each client holds only a small partition of the data, DP-CDA faces new challenges. The limited sample size per client increases the sensitivity of local computations, requiring higher noise injection to maintain the differential privacy guarantee. This, in turn, leads to a noticeable degradation in the utility compared to the centralized setting. To mitigate this issue, we integrate the Correlation-Assisted Private Estimation (CAPE) protocol into the federated DP-CDA framework and propose CAPE Assisted Federated DP-CDA algorithm. CAPE enables limited collaboration among the clients by allowing them to generate jointly distributed (anti-correlated) noise that cancels out in aggregate, while preserving privacy at the individual level. This technique significantly improves the privacy-utility trade-off in the federated setting. Extensive experiments on MNIST and FashionMNIST datasets demonstrate that the proposed CAPE Assisted Federated DP-CDA approach can achieve utility comparable to its centralized counterpart under some parameter regime, while maintaining rigorous differential privacy guarantees.
Problem

Research questions and friction points this paper is trying to address.

Enhancing privacy-utility tradeoff in decentralized synthetic data generation
Mitigating utility degradation from noise in federated differential privacy
Enabling client collaboration with correlated noise for privacy preservation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Correlated noise for privacy enhancement
Federated DP-CDA with CAPE protocol
Anti-correlated noise cancellation technique
🔎 Similar Papers
No similar papers found.