Differentially Private Conformal Prediction via Quantile Binary Search

📅 2025-07-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing differentially private (DP) methods primarily focus on privacy protection during model training, overlooking the privacy leakage risks that arise from calibration data in uncertainty quantification frameworks such as conformal prediction (CP). This work integrates DP into the CP calibration stage, proposing P-COQS, a privacy-preserving, calibration-aware prediction framework. P-COQS employs randomized binary search and noisy quantile estimation to achieve rigorous $(\varepsilon,\delta)$-DP guarantees under finite samples while preserving near-target coverage. Theoretical analysis characterizes the fundamental trade-off among privacy budget, coverage accuracy, and prediction set size. Extensive experiments on CIFAR-10, ImageNet, and CoronaHack show that P-COQS performs favorably against the existing DP alternative: it yields smaller, more informative prediction sets without compromising coverage, thereby jointly ensuring privacy, statistical validity, and practical utility.
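To make the core idea concrete, here is a minimal illustrative sketch of a DP quantile computed by noisy binary search: the interval is repeatedly halved, and at each midpoint the count of calibration scores below the midpoint is perturbed with Laplace noise before being compared to the target rank. This is a simplified assumption-laden illustration (naive per-step budget split, Laplace rather than the paper's exact mechanism), not the actual P-COQS algorithm; the function name and parameters are hypothetical.

```python
import numpy as np

def dp_quantile_binary_search(scores, q, epsilon, lo, hi, steps=20, rng=None):
    """Illustrative DP quantile via noisy binary search (NOT the exact
    P-COQS procedure). Halves [lo, hi] `steps` times; at each midpoint the
    count of scores at or below the midpoint (sensitivity 1) is perturbed
    with Laplace noise, then compared to the target rank q * n."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(scores)
    target = q * n
    eps_step = epsilon / steps  # naive even split of the privacy budget
    for _ in range(steps):
        mid = (lo + hi) / 2.0
        # Noisy count of calibration scores at or below the midpoint.
        noisy_count = np.sum(scores <= mid) + rng.laplace(scale=1.0 / eps_step)
        if noisy_count < target:
            lo = mid  # quantile lies above the midpoint
        else:
            hi = mid  # quantile lies at or below the midpoint
    return (lo + hi) / 2.0
```

Because each step consumes part of the budget, tighter privacy (smaller $\varepsilon$) injects more noise into the comparisons, which is one source of the coverage/efficiency trade-off the paper analyzes.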

📝 Abstract
Most Differentially Private (DP) approaches focus on limiting privacy leakage from learners with respect to the data they are trained on; fewer approaches consider leakage when procedures involve a calibration dataset, as is common in uncertainty quantification methods such as Conformal Prediction (CP). Given the limited number of approaches in this direction, in this work we deliver a general DP approach for CP that we call Private Conformity via Quantile Search (P-COQS). The proposed approach adapts an existing randomized binary search algorithm for computing DP quantiles in the calibration phase of CP, thereby guaranteeing privacy of the consequent prediction sets. This however comes at the price of slightly under-covering with respect to the desired $(1 - \alpha)$-level when using finite-sample calibration sets (although broad empirical results show that P-COQS generally attains the required level in the considered cases). Confirming properties of the adapted algorithm and quantifying the approximate coverage guarantees of the consequent CP, we conduct extensive experiments to examine the effects of privacy noise, sample size and significance level on the performance of our approach compared to existing alternatives. In addition, we empirically evaluate our approach on several benchmark datasets, including CIFAR-10, ImageNet and CoronaHack. Our results suggest that the proposed method is robust to privacy noise and performs favorably with respect to the current DP alternative in terms of empirical coverage, efficiency, and informativeness. Specifically, the results indicate that P-COQS produces smaller conformal prediction sets while simultaneously targeting the desired coverage and privacy guarantees in all these experimental settings.
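Once the calibration phase yields a (privatized) score threshold, forming a conformal prediction set is straightforward: include every label whose conformity score falls at or below the threshold. The sketch below assumes the common "one minus softmax probability" score; the abstract does not specify the score function, so this choice and the function name are illustrative assumptions.

```python
def conformal_set(probs, threshold):
    """Return the conformal prediction set for one input, given per-class
    probabilities and a calibrated score threshold. Uses the common score
    s(x, y) = 1 - p_y (an assumption; the paper's score is unspecified here)."""
    scores = [1.0 - p for p in probs]
    return [label for label, s in enumerate(scores) if s <= threshold]
```

For example, with class probabilities `[0.7, 0.2, 0.1]` and threshold `0.85`, the set contains classes 0 and 1; a noisier (more private) threshold tends to enlarge such sets, which is why set size is a key efficiency metric in the experiments.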
Problem

Research questions and friction points this paper is trying to address.

Ensuring privacy in Conformal Prediction calibration datasets
Addressing under-coverage in finite-sample DP quantile search
Comparing DP-CP methods on coverage, efficiency, and privacy
Innovation

Methods, ideas, or system contributions that make the work stand out.

DP approach for Conformal Prediction via quantile search
Adapts randomized binary search for DP quantiles
Guarantees private prediction sets at the cost of slight finite-sample under-coverage
Ogonnaya M. Romanus
Department of Mathematics & Statistics, Auburn University, Auburn, AL 36849, USA
Roberto Molinari
Assistant Professor, Auburn University
Statistics and Data Science