Rectifying Conformity Scores for Better Conditional Coverage

📅 2025-02-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the problem of insufficient conditional coverage in split conformal prediction. The authors propose a trainable score calibration method: within the split conformal framework, an arbitrary base conformity score is adaptively transformed via neural-network-parameterized differentiable quantile regression. The method rigorously guarantees marginal coverage and, crucially, establishes for the first time a theoretical bound linking the quantile estimation error to conditional coverage, thereby substantially improving the reliability of local conditional coverage. Unlike conventional conformalized quantile regression, the approach overcomes inherent limitations in multi-output settings. Empirical evaluation on multi-class and multi-output tasks demonstrates significant gains in conditional coverage while preserving strong local adaptivity and statistical robustness.

📝 Abstract
We present a new method for generating confidence sets within the split conformal prediction framework. Our method performs a trainable transformation of any given conformity score to improve conditional coverage while ensuring exact marginal coverage. The transformation is based on an estimate of the conditional quantile of conformity scores. The resulting method is particularly beneficial for constructing adaptive confidence sets in multi-output problems where standard conformal quantile regression approaches have limited applicability. We develop a theoretical bound that captures the influence of the accuracy of the quantile estimate on the approximate conditional validity, unlike classical bounds for conformal prediction methods that only offer marginal coverage. We experimentally show that our method is highly adaptive to the local data structure and outperforms existing methods in terms of conditional coverage, improving the reliability of statistical inference in various applications.
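The core idea of the abstract can be illustrated with a minimal sketch: start from a base conformity score (here, the absolute residual of a regression model), rescale it by an estimate of its conditional quantile, and then run standard split conformal calibration on the rescaled score so that exact marginal coverage is preserved. The binned quantile estimator below is a deliberately crude stand-in for the paper's trainable neural quantile regression; the synthetic data, the known predictor `mu`, and all variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic heteroscedastic regression data: noise scale grows with x,
# so a fixed-width interval cannot achieve good conditional coverage.
def sample(n):
    x = rng.uniform(0, 5, n)
    y = np.sin(x) + rng.normal(0, 0.1 + 0.2 * x, n)
    return x, y

x_cal, y_cal = sample(2000)
x_test, y_test = sample(2000)

mu = np.sin          # assume the point predictor f(x) = sin(x) is given
alpha = 0.1          # target miscoverage level

# Base conformity score: absolute residual.
def score(x, y):
    return np.abs(y - mu(x))

# Crude conditional-quantile estimate of the score via binned local quantiles
# (assumption: a stand-in for the paper's differentiable quantile regressor).
bins = np.linspace(0, 5, 11)
s_cal = score(x_cal, y_cal)
idx_cal = np.clip(np.digitize(x_cal, bins) - 1, 0, 9)
q_hat = np.array([np.quantile(s_cal[idx_cal == b], 1 - alpha)
                  for b in range(10)])

def q(x):
    return q_hat[np.clip(np.digitize(x, bins) - 1, 0, 9)]

# Transformed ("rectified") score: base score over estimated conditional quantile.
t_cal = s_cal / q(x_cal)

# Split conformal calibration on the transformed scores: the usual
# finite-sample quantile gives exact marginal coverage regardless of
# how accurate q() is.
n = len(t_cal)
tau = np.quantile(t_cal, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction set at x: {y : score(x, y) <= tau * q(x)}, i.e. an interval
# whose width adapts to the local noise level.
lo = mu(x_test) - tau * q(x_test)
hi = mu(x_test) + tau * q(x_test)
coverage = np.mean((y_test >= lo) & (y_test <= hi))
print(round(coverage, 2))
```

The marginal coverage guarantee comes entirely from the final quantile step on `t_cal`; the quality of the quantile estimate `q` only affects how well coverage holds conditionally, which is the trade-off the paper's theoretical bound quantifies.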
Problem

Research questions and friction points this paper is trying to address.

Improving conditional coverage in confidence sets
Adaptive confidence sets for multi-output problems
Enhancing reliability of statistical inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Trainable conformity score transformation
Ensures exact marginal coverage
Improves conditional coverage adaptively