Tighter Privacy Analysis for Truncated Poisson Sampling

📅 2025-08-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the privacy amplification effect of truncated Poisson sampling—a variant of Poisson sampling in which batches exceeding a capacity threshold are truncated—in differentially private learning. Unlike conventional analyses that assume ideal Poisson sampling, the paper provides the first rigorous probabilistic model and privacy analysis for the truncated setting, aligning the theory with practical training pipelines (e.g., gradient clipping combined with batch truncation). Methodologically, it characterizes the shift in the sampling distribution induced by truncation and derives a tighter upper bound on the Rényi differential privacy (RDP) loss; the resulting bound strictly improves on existing RDP bounds under the same privacy budget. Experiments on standard benchmarks validate the tightness of the theoretical bound and show improved model utility, with gains in accuracy and convergence speed at no cost to the privacy guarantee.
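The sampling mechanism described above can be sketched in a few lines. The truncation rule used here—keeping a uniformly random subset when the sampled batch exceeds the cap—is an assumption for illustration, as are the function and parameter names; the paper may specify a different rule.

```python
import random

def truncated_poisson_sample(dataset, q, max_batch_size, rng=random):
    """Poisson-sample each record independently with probability q,
    then truncate the batch if it exceeds max_batch_size.

    Truncation here keeps a uniformly random subset of the cap size
    (an illustrative assumption, not necessarily the paper's rule).
    """
    batch = [x for x in dataset if rng.random() < q]
    if len(batch) > max_batch_size:
        batch = rng.sample(batch, max_batch_size)
    return batch

# Example: sampling rate q = 0.01 over 1000 records, batch cap 15.
data = list(range(1000))
batch = truncated_poisson_sample(data, q=0.01, max_batch_size=15)
assert len(batch) <= 15
```

Note that unlike fixed-size subsampling, each record's inclusion is still decided independently first, which is what makes the amplification analysis differ from both pure Poisson sampling and sampling without replacement.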

📝 Abstract
We give a new privacy amplification analysis for truncated Poisson sampling, a Poisson sampling variant that truncates a batch if it exceeds a given maximum batch size.
Problem

Research questions and friction points this paper is trying to address.

Analyzing privacy amplification for truncated Poisson sampling
Addressing privacy in Poisson sampling with batch truncation
Improving privacy analysis for bounded batch size sampling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Truncated Poisson sampling technique
Privacy amplification analysis method
Maximum batch size truncation mechanism
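The distribution shift induced by the truncation mechanism can be illustrated with a small calculation: without truncation, the batch size under Poisson sampling of n records at rate q is Binomial(n, q), and truncation moves all probability mass above the cap onto the cap itself. This is a sketch of the batch-size marginal only, not the paper's full characterization of the sampling distribution.

```python
from math import comb

def truncated_batch_size_pmf(n, q, cap):
    """PMF of the batch size under truncated Poisson sampling.

    Without truncation the batch size is Binomial(n, q); truncation
    collapses all mass above `cap` onto `cap` (illustrative sketch).
    """
    binom = [comb(n, k) * q**k * (1 - q)**(n - k) for k in range(n + 1)]
    # out[k] = P(batch size == k) for k in 0..cap
    return binom[:cap] + [sum(binom[cap:])]

pmf = truncated_batch_size_pmf(n=1000, q=0.01, cap=15)
assert abs(sum(pmf) - 1.0) < 1e-9   # still a valid distribution
```

Choosing the cap near the mean n*q keeps the truncated mass small, which is presumably why the resulting privacy loss can stay close to that of ideal Poisson sampling.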