Differentially Private E-Values

📅 2025-10-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses privacy leakage when e-values are used for statistical inference and risk control on sensitive data. The authors propose the first general framework for transforming any non-private e-value into a differentially private (DP) e-value, supporting arbitrary stopping times and post-hoc valid inference. The core innovation is a biased multiplicative noise mechanism that preserves the statistical validity and power of e-values under strict ε-DP guarantees, with asymptotic performance approaching that of the non-private counterpart. The method connects DP mechanisms, online monitoring, conformal prediction, and hypothesis testing. Experiments on online risk monitoring, healthcare analytics, and conformal e-prediction show that the approach outperforms existing DP e-value baselines, delivering both strong statistical power and rigorous privacy protection.

📝 Abstract
E-values have gained prominence as flexible tools for statistical inference and risk control, enabling anytime- and post-hoc-valid procedures under minimal assumptions. However, many real-world applications fundamentally rely on sensitive data, which can be leaked through e-values. To ensure their safe release, we propose a general framework to transform non-private e-values into differentially private ones. Towards this end, we develop a novel biased multiplicative noise mechanism that ensures our e-values remain statistically valid. We show that our differentially private e-values attain strong statistical power, and are asymptotically as powerful as their non-private counterparts. Experiments across online risk monitoring, private healthcare, and conformal e-prediction demonstrate our approach's effectiveness and illustrate its broad applicability.
Problem

Research questions and friction points this paper is trying to address.

How can sensitive data be prevented from leaking through released e-values in statistical inference?
How can non-private e-values be made differentially private without destroying their statistical validity?
How much statistical power must be sacrificed for privacy across applications such as online monitoring and conformal prediction?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Novel biased multiplicative noise mechanism for ε-DP release
Transforms any non-private e-value into a differentially private one
Preserves statistical validity, with power asymptotically matching the non-private e-value
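To make the mechanism concrete, here is a minimal sketch of *one plausible instantiation* of a bias-corrected multiplicative noise mechanism, not necessarily the paper's exact construction: add Laplace noise to the log e-value (the standard Laplace mechanism, which is ε-DP when the log e-value has the stated L1 sensitivity — an assumption here), then divide by the Laplace moment generating function so the null expectation stays at most 1, preserving e-value validity. The function name and parameters are illustrative.

```python
import numpy as np

def private_e_value(e_value, sensitivity, epsilon, rng):
    """Hedged sketch: bias-corrected multiplicative Laplace noise on an e-value.

    Assumes log(e_value) has the given L1 sensitivity, so adding
    Laplace(sensitivity/epsilon) noise on the log scale is epsilon-DP.
    Dividing by the Laplace MGF at t=1 removes the multiplicative bias,
    so under the null E[output] = E[e_value] <= 1 (validity preserved).
    """
    b = sensitivity / epsilon          # Laplace scale for epsilon-DP
    if b >= 1.0:
        # E[exp(Z)] diverges for Laplace scale >= 1; need a larger epsilon
        raise ValueError("MGF correction requires scale b < 1")
    z = rng.laplace(loc=0.0, scale=b)  # noise added on the log scale
    mgf = 1.0 / (1.0 - b * b)          # E[exp(Z)] for Z ~ Laplace(0, b)
    return e_value * np.exp(z) / mgf   # multiplicative noise, bias-corrected

rng = np.random.default_rng(0)
# Under the null a valid e-value has expectation <= 1; averaging many
# private releases of E = 1.0 should stay near 1.0 after correction.
samples = [private_e_value(1.0, sensitivity=0.1, epsilon=0.5, rng=rng)
           for _ in range(200_000)]
print(float(np.mean(samples)))
```

The design point is that multiplicative noise alone inflates the e-value's expectation (E[exp(Z)] > 1), which would break validity; the MGF correction factor is what lets the private e-value remain a genuine e-value, at the cost of a deterministic power loss that vanishes as ε grows.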