Optimal Differentially Private Sampling of Unbounded Gaussians

📅 2025-03-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the open problem posed by Ghazi et al. on efficiently sampling from an unknown-parameter, unbounded Gaussian distribution under (ε,δ)-differential privacy. Method: We propose a novel framework integrating privacy amplification by iteration, robust mean and covariance estimation, and adaptive truncation—specifically designed to mitigate the privacy-accuracy trade-off inherent in unbounded data. Contribution/Results: Our algorithm is the first to achieve private Gaussian sampling with only Õ(d) samples, attaining optimal sample complexity. It improves upon the prior best result by a factor of Ω(d), thereby fully resolving the open problem. Theoretical analysis establishes strictly tighter error bounds and stronger privacy guarantees than all existing methods, with rigorous (ε,δ)-DP compliance throughout.

Technology Category

Application Category

📝 Abstract
We provide the first $widetilde{mathcal{O}}left(d ight)$-sample algorithm for sampling from unbounded Gaussian distributions under the constraint of $left(varepsilon, delta ight)$-differential privacy. This is a quadratic improvement over previous results for the same problem, settling an open question of Ghazi, Hu, Kumar, and Manurangsi.
Problem

Research questions and friction points this paper is trying to address.

Develops an efficient algorithm for sampling unbounded Gaussians
Ensures differential privacy with $(varepsilon, delta)$ constraints
Improves sample complexity to $widetilde{mathcal{O}}(d)$
Innovation

Methods, ideas, or system contributions that make the work stand out.

First algorithm for unbounded Gaussian sampling
Achieves differential privacy with fewer samples
Quadratic improvement over previous methods
🔎 Similar Papers