Training-Conditional Coverage Bounds under Covariate Shift

📅 2024-05-26
🏛️ arXiv.org
📈 Citations: 3
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the lack of theoretical guarantees on training-conditional coverage, i.e., coverage conditional on the realized training data, for conformal prediction under covariate shift. We systematically investigate how this coverage can be upper bounded and controlled. We derive a weighted Dvoretzky–Kiefer–Wolfowitz inequality that yields tight, provable training-conditional coverage bounds for split conformal prediction under nearly assumption-free conditions. Leveraging algorithmic uniform stability, we further provide the first training-conditional coverage guarantees for the full conformal and jackknife+ methods. Our results show that all three mainstream conformal prediction frameworks achieve controllable training-conditional coverage under covariate shift, with split conformal yielding bounds that are both tight and minimally assumption-dependent. This fills a critical theoretical gap in the conditional coverage analysis of conformal prediction beyond the i.i.d. setting.
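
For orientation, the classical DKW inequality and the reweighted empirical CDF it would need to control under covariate shift are sketched below; the weighted CDF is a schematic reading of the abstract, not the paper's exact construction.

```latex
% Classical DKW inequality (with Massart's tight constant) for the empirical
% CDF of i.i.d. nonconformity scores S_1, ..., S_n:
\[
  \Pr\Big( \sup_{t \in \mathbb{R}} \big| \widehat{F}_n(t) - F(t) \big| > \varepsilon \Big)
  \;\le\; 2 e^{-2 n \varepsilon^2},
  \qquad
  \widehat{F}_n(t) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{ S_i \le t \}.
\]
% Under covariate shift, the natural empirical CDF reweights each score by the
% likelihood ratio w(x) = dQ_X/dP_X of test to training covariate densities:
\[
  \widehat{F}^{\,w}_n(t)
  = \sum_{i=1}^{n} \frac{w(X_i)}{\sum_{j=1}^{n} w(X_j)} \,\mathbf{1}\{ S_i \le t \}.
\]
```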

📝 Abstract
Training-conditional coverage guarantees in conformal prediction concern the concentration of the error distribution, conditional on the training data, below some nominal level. The conformal prediction methodology has recently been generalized to the covariate shift setting, namely, the setting in which the covariate distribution changes between the training and test data. In this paper, we study the training-conditional coverage properties of a range of conformal prediction methods under covariate shift via a weighted version of the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality tailored for distribution change. The result for the split conformal method is almost assumption-free, while the results for the full conformal and jackknife+ methods rely on strong assumptions, including the uniform stability of the training algorithm.
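
Concretely, the covariate-shift generalization referred to here is weighted conformal prediction (Tibshirani et al., 2019), which replaces the ordinary empirical quantile of calibration scores with a likelihood-ratio-weighted one. The sketch below is a minimal illustration assuming known likelihood ratios and absolute-residual scores; the data and all names are illustrative, not the authors' implementation.

```python
# A minimal sketch of weighted split conformal prediction under covariate
# shift (Tibshirani et al., 2019), the methodology whose training-conditional
# coverage the paper studies. The absolute-residual score, the toy data, and
# all names here are illustrative assumptions, not the authors' code.
import numpy as np

def weighted_conformal_quantile(scores_cal, w_cal, w_test, alpha=0.1):
    """Weighted (1 - alpha) quantile of calibration nonconformity scores.

    scores_cal: scores s_i = |y_i - mu(x_i)| on a held-out calibration split.
    w_cal:      likelihood ratios w(x_i) = dQ_X/dP_X at calibration points.
    w_test:     likelihood ratio w(x) at the test point.
    """
    scores_cal = np.asarray(scores_cal, dtype=float)
    # Normalize weights over the calibration points and the test point; the
    # test point's own mass sits (conservatively) at +infinity.
    weights = np.append(np.asarray(w_cal, dtype=float), w_test)
    probs = weights / weights.sum()
    # Smallest score whose cumulative normalized weight reaches 1 - alpha.
    order = np.argsort(scores_cal)
    cum = np.cumsum(probs[:-1][order])
    idx = np.searchsorted(cum, 1.0 - alpha)
    if idx >= len(scores_cal):
        return np.inf  # the prediction set is the whole response space
    return scores_cal[order][idx]

# Toy usage: the prediction set at x is {y : |y - mu(x)| <= qhat}.
rng = np.random.default_rng(0)
scores = np.abs(rng.normal(size=500))       # stand-in residual scores
w = np.exp(0.5 * rng.normal(size=500))      # stand-in likelihood ratios
qhat = weighted_conformal_quantile(scores, w_cal=w, w_test=1.3)
print(f"weighted conformal quantile: {qhat:.3f}")
```

Note that the weighted quantile is infinite whenever the test point's normalized weight exceeds alpha, which is one reason coverage statements under covariate shift require care.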
Problem

Research questions and friction points this paper is trying to address.

Extends conformal prediction to covariate shift scenarios
Analyzes training-conditional coverage bounds for prediction sets
Quantifies impact of distributional changes on prediction quality
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends conformal prediction to covariate shift
Derives training-conditional coverage PAC bounds (the standard i.i.d. form is sketched after this list)
Quantifies prediction set quality vs distribution changes
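
For reference, the i.i.d. training-conditional bound for split conformal that DKW-style arguments yield (Vovk, 2012) is sketched below; per the abstract, the paper's contribution is a weighted analogue of such bounds under covariate shift.

```latex
% Known i.i.d. training-conditional (PAC-style) bound for split conformal
% prediction with n calibration points, obtained via DKW: for the prediction
% set \widehat{C} built from training data D_n and any eps > 0,
\[
  \Pr_{D_n}\!\Big( \Pr\big( Y \notin \widehat{C}(X) \,\big|\, D_n \big) \le \alpha + \varepsilon \Big)
  \;\ge\; 1 - 2 e^{-2 n \varepsilon^2}.
\]
```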
Authors
Mehrdad Pournaderi, University of Utah (statistics, signal processing)
Yu Xiang, Department of Electrical and Computer Engineering, University of Utah