High-Probability Bounds For Heterogeneous Local Differential Privacy

📅 2025-10-13
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper studies statistical estimation under local differential privacy (LDP) with heterogeneous privacy requirements—where users specify distinct privacy budgets—focusing on one- and multi-dimensional mean estimation and discrete distribution learning under the ℓ∞-distance, with high-probability error bounds (rather than expectation-based guarantees). Methodologically, it integrates customized privacy mechanism design, concentration inequalities, and information-theoretic lower bound analysis. The work establishes the first finite-sample upper bounds for heterogeneous LDP with explicit high-probability guarantees, and provides matching minimax lower bounds, thereby rigorously proving statistical optimality. Specifically, it achieves optimal high-probability ℓ₂-error bounds for mean estimation and high-probability ℓ∞-convergence for distribution learning. These results furnish a theoretical foundation and design principles for personalized LDP mechanisms.

📝 Abstract
We study statistical estimation under local differential privacy (LDP) when users may hold heterogeneous privacy levels and accuracy must be guaranteed with high probability. Departing from the common in-expectation analyses, and for one-dimensional and multi-dimensional mean estimation problems, we develop finite-sample upper bounds in $\ell_2$-norm that hold with probability at least $1-\beta$. We complement these results with matching minimax lower bounds, establishing the optimality (up to constants) of our guarantees in the heterogeneous LDP regime. We further study distribution learning in $\ell_\infty$-distance, designing an algorithm with high-probability guarantees under heterogeneous privacy demands. Our techniques offer principled guidance for designing mechanisms in settings with user-specific privacy levels.
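The heterogeneous setting described in the abstract, where each user $i$ reports under their own budget $\varepsilon_i$ and a server aggregates the noisy reports, can be illustrated with a minimal sketch. This is not the paper's mechanism: it assumes a generic per-user Laplace mechanism for bounded values and inverse-variance weighting (weight proportional to $\varepsilon_i^2$) at the aggregator; all function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def privatize(x, eps, rng):
    # Laplace mechanism: for x in [0, 1] the sensitivity is 1,
    # so Laplace noise with scale 1/eps satisfies eps-LDP.
    return x + rng.laplace(scale=1.0 / eps)

def hetero_mean(reports, eps):
    # A report made with budget eps_i carries noise variance 2/eps_i^2,
    # so inverse-variance weighting assigns it weight ~ eps_i^2.
    w = np.asarray(eps, dtype=float) ** 2
    return float(np.average(reports, weights=w))

# Simulated population: values in [0.3, 0.7], user-specific budgets.
n = 20000
x = rng.uniform(0.3, 0.7, size=n)
eps = rng.choice([0.5, 1.0, 4.0], size=n)  # heterogeneous budgets
reports = np.array([privatize(xi, ei, rng) for xi, ei in zip(x, eps)])

est = hetero_mean(reports, eps)
```

Weighting by $\varepsilon_i^2$ lets the few low-noise (high-budget) users dominate the estimate, which is the basic trade-off a heterogeneous mechanism must balance; the paper's analysis makes the resulting error precise with high-probability rather than in-expectation guarantees.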
Problem

Research questions and friction points this paper is trying to address.

Develops high-probability error bounds for estimation under heterogeneous local differential privacy
Establishes optimal guarantees for mean estimation when users hold varying privacy levels
Provides distribution learning algorithms that accommodate user-specific privacy demands
Innovation

Methods, ideas, or system contributions that make the work stand out.

Developed finite-sample, high-probability upper bounds for mean estimation
Established matching minimax lower bounds that prove optimality up to constants
Designed a distribution learning algorithm with high-probability guarantees under heterogeneous privacy