Equivalence Testing Under Privacy Constraints

πŸ“… 2026-04-07
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This study addresses the risk of individual privacy leakage in standard equivalence testing, particularly in high-sensitivity domains such as healthcare. It introduces a unified differentially private two one-sided tests (DP-TOST) framework that, for the first time, enables privacy-preserving equivalence inference for both means and proportions. Because the sampling distributions of the privatized statistics are analytically intractable in small-sample settings, the method relies on simulation-based calibration, and it combines careful privacy-budget allocation with the TOST paradigm to rigorously control the Type I error rate under strong differential privacy guarantees. Theoretical analysis and extensive experiments demonstrate that, under reasonable privacy budgets or sample sizes, the proposed approach achieves statistical power nearly matching that of non-private benchmarks. Its reliability and practical utility are further validated on both simulated data and real-world medical datasets.
πŸ“ Abstract
Protecting individual privacy is essential across research domains, from socio-economic surveys to big-tech user data. This need is particularly acute in healthcare, where analyses often involve sensitive patient information. A typical example is comparing treatment efficacy across hospitals or ensuring consistency in diagnostic laboratory calibrations, both requiring privacy-preserving statistical procedures. However, standard equivalence testing procedures for differences in proportions or means, commonly used to assess average equivalence, can inadvertently disclose sensitive information. To address this problem, we develop differentially private equivalence testing procedures that rely on simulation-based calibration, as the finite-sample distribution is analytically intractable. Our approach introduces a unified framework, termed DP-TOST, for conducting differentially private equivalence testing of both means and proportions. Through numerical simulations and real-world applications, we demonstrate that the proposed method maintains type-I error control at the nominal level and achieves power comparable to its non-private counterpart as the privacy budget and/or sample size increases, while ensuring strong privacy guarantees. These findings establish a reliable and practical framework for privacy-preserving equivalence testing in high-stakes fields such as healthcare, among others.
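The abstract describes privatizing an equivalence test and calibrating it by simulation because the finite-sample distribution of the noisy statistic is intractable. The paper's actual DP-TOST calibration is not reproduced here; the following is only a minimal sketch of the underlying idea for a single mean, using a Laplace-noised sample mean (sensitivity from assumed known data bounds) plugged naively into the two one-sided tests. The function name `dp_tost_mean` and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy import stats

def dp_tost_mean(x, delta, epsilon, bounds, alpha=0.05, rng=None):
    """Sketch of a differentially private TOST for a single mean.

    Equivalence hypotheses: H0: |mu| >= delta  vs  H1: |mu| < delta.
    The sample mean is privatized with Laplace noise; unlike the
    paper's DP-TOST, no simulation-based calibration is performed,
    so finite-sample Type I error control is NOT guaranteed here.
    """
    rng = np.random.default_rng(rng)
    lo, hi = bounds
    x = np.clip(x, lo, hi)               # enforce known bounds so sensitivity holds
    n = len(x)
    sens = (hi - lo) / n                 # L1 sensitivity of the sample mean
    noisy_mean = x.mean() + rng.laplace(scale=sens / epsilon)
    se = x.std(ddof=1) / np.sqrt(n)      # non-private standard error (sketch only)
    t_lower = (noisy_mean + delta) / se  # tests H0: mu <= -delta
    t_upper = (noisy_mean - delta) / se  # tests H0: mu >= +delta
    p_lower = 1 - stats.t.cdf(t_lower, df=n - 1)
    p_upper = stats.t.cdf(t_upper, df=n - 1)
    p = max(p_lower, p_upper)            # TOST rejects only if both one-sided tests reject
    return noisy_mean, p, p < alpha
```

With a tight distribution around zero and a generous privacy budget, the test declares equivalence, matching the abstract's observation that power approaches the non-private benchmark as the budget or sample size grows.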
Problem

Research questions and friction points this paper aims to address.

equivalence testing
differential privacy
privacy-preserving statistics
treatment efficacy comparison
sensitive data analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

differential privacy
equivalence testing
simulation-based calibration
DP-TOST
privacy-preserving statistics
Savita Pareek
Postdoctoral Research Fellow, Auburn University
Statistics, Data Science, Biostatistics, Econometrics
Luca Insolia
School of Pharmaceutical Sciences, University of Geneva
Roberto Molinari
Assistant Professor, Auburn University
Statistics and Data Science
StΓ©phane Guerrier
School of Pharmaceutical Sciences, University of Geneva