Optimal Transport-based Conformal Prediction

📅 2025-01-31
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Conventional black-box uncertainty quantification methods for multi-output regression and classification are constrained by scalar scoring functions and convex prediction sets, limiting their ability to capture complex geometric structures in multivariate output spaces. Method: We propose a geometrically adaptive conformal prediction framework grounded in optimal transport theory. This is the first work to introduce Monge–Kantorovich vector ranks and vector quantiles into conformal prediction, enabling distribution-free, finite-sample valid (conditional) coverage guarantees for non-convex, shape-flexible prediction sets. The method integrates optimal transport, vector rank statistics, and multi-output adaptation mechanisms. Contribution/Results: Our framework preserves rigorous theoretical guarantees while significantly improving predictive efficiency and empirical coverage. Extensive experiments demonstrate superior performance and robustness over state-of-the-art baselines on diverse multi-target regression and multi-class classification benchmarks.

Technology Category

Application Category

📝 Abstract
Conformal Prediction (CP) is a principled framework for quantifying uncertainty in blackbox learning models, by constructing prediction sets with finite-sample coverage guarantees. Traditional approaches rely on scalar nonconformity scores, which fail to fully exploit the geometric structure of multivariate outputs, such as in multi-output regression or multiclass classification. Recent methods addressing this limitation impose predefined convex shapes for the prediction sets, potentially misaligning with the intrinsic data geometry. We introduce a novel CP procedure handling multivariate score functions through the lens of optimal transport. Specifically, we leverage Monge-Kantorovich vector ranks and quantiles to construct prediction region with flexible, potentially non-convex shapes, better suited to the complex uncertainty patterns encountered in multivariate learning tasks. We prove that our approach ensures finite-sample, distribution-free coverage properties, similar to typical CP methods. We then adapt our method for multi-output regression and multiclass classification, and also propose simple adjustments to generate adaptive prediction regions with asymptotic conditional coverage guarantees. Finally, we evaluate our method on practical regression and classification problems, illustrating its advantages in terms of (conditional) coverage and efficiency.
Problem

Research questions and friction points this paper is trying to address.

Multi-output Prediction
Uncertainty Estimation
Black-box Models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-output Conformal Prediction
Optimal Transport
Monge-Kantorovich vector quantiles
🔎 Similar Papers
No similar papers found.