SpeedCP: Fast Kernel-based Conditional Conformal Prediction

📅 2025-09-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational cost and poor scalability of full solution-path computation in kernel-based conditional conformal prediction within reproducing kernel Hilbert spaces (RKHS), this paper proposes a fast, kernel-driven conditional conformal prediction method. The approach makes three key contributions: (1) a stable and efficient joint path-following algorithm for regularization-parameter and smoothness selection, enabling data-adaptive calibration; (2) a low-rank latent-variable embedding that overcomes the scalability bottlenecks inherent in high-dimensional kernel methods; and (3) approximate conditional coverage intervals constructed via kernel quantile regression, with rigorous finite-sample theoretical guarantees. Empirical evaluations demonstrate that the method achieves reliable conditional coverage across diverse black-box predictors, reducing average prediction-interval width by 30% and accelerating computation by up to 40× relative to baseline approaches.

📝 Abstract
Conformal prediction provides distribution-free prediction sets with finite-sample conditional guarantees. We build upon the RKHS-based framework of Gibbs et al. (2023), which leverages families of covariate shifts to provide approximate conditional conformal prediction intervals, an approach with strong theoretical promise, but with prohibitive computational cost. To bridge this gap, we develop a stable and efficient algorithm that computes the full solution path of the regularized RKHS conformal optimization problem, at essentially the same cost as a single kernel quantile fit. Our path-tracing framework simultaneously tunes hyperparameters, providing smoothness control and data-adaptive calibration. To extend the method to high-dimensional settings, we further integrate our approach with low-rank latent embeddings that capture conditional validity in a data-driven latent space. Empirically, our method provides reliable conditional coverage across a variety of modern black-box predictors, improving the interval length of Gibbs et al. (2023) by 30%, while achieving a 40-fold speedup.
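The abstract's building block is a regularized kernel quantile fit: the conformal intervals come from kernel quantile regression, and the paper's contribution is computing the full regularization path at roughly the cost of one such fit. As a rough illustration of that single fit only (a plain subgradient-descent sketch of the pinball-loss objective in NumPy, not the paper's path-following algorithm; the kernel choice, learning rate, and regularization strength below are arbitrary assumptions):

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """RBF kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_quantile(X, y, tau=0.9, lam=0.1, gamma=1.0,
                        lr=0.01, n_iter=2000):
    """Fit f(x) = sum_i alpha_i k(x, x_i) by subgradient descent on the
    regularized pinball loss  sum_i rho_tau(y_i - f(x_i)) + lam/2 * a'Ka.
    Illustrative only -- a path-following method would trace solutions
    across lam instead of solving one problem at a time."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(len(y))
    for _ in range(n_iter):
        r = y - K @ alpha
        # subgradient of the pinball loss w.r.t. the predictions f
        g = np.where(r >= 0, -tau, 1.0 - tau)
        grad = K @ g + lam * (K @ alpha)
        alpha -= lr * grad / len(y)
    return alpha

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(80, 1))
y = np.sin(X[:, 0]) + 0.3 * rng.standard_normal(80)
alpha = fit_kernel_quantile(X, y, tau=0.9)
q = rbf_kernel(X, X) @ alpha          # fitted 0.9-quantile curve
cov = np.mean(y <= q)                  # empirical (marginal) coverage
```

At a subgradient stationary point roughly a tau-fraction of the residuals sit below the fit, which is what makes the fitted curve usable as an interval endpoint; the paper's contribution is obtaining this fit for all regularization levels at once.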
Problem

Research questions and friction points this paper is trying to address.

Developing a fast algorithm for kernel-based conditional conformal prediction
Reducing computational cost while maintaining conditional coverage guarantees
Extending the method to high-dimensional settings via latent embeddings
Innovation

Methods, ideas, or system contributions that make the work stand out.

An efficient algorithm computes the full solution path
Hyperparameters are tuned simultaneously for smoothness control and calibration
Low-rank latent embeddings extend the method to high dimensions
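The low-rank embedding idea above can be illustrated with a standard Nyström approximation: a small set of landmark points yields rank-m features whose inner products approximate the full kernel matrix, avoiding the O(n²) bottleneck. This is a generic sketch of the low-rank technique, not the paper's specific latent-embedding construction (the landmark count and kernel bandwidth are arbitrary assumptions):

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    """RBF kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, landmarks, gamma=0.5):
    """Rank-m feature map Phi such that Phi @ Phi.T approximates K(X, X)
    via the Nystrom formula  K_nm @ K_mm^{-1} @ K_nm.T."""
    Kmm = rbf_kernel(landmarks, landmarks, gamma)
    w, V = np.linalg.eigh(Kmm)          # small m x m eigendecomposition
    w = np.clip(w, 1e-12, None)          # guard against tiny eigenvalues
    Knm = rbf_kernel(X, landmarks, gamma)
    return (Knm @ V) / np.sqrt(w)

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 3))
landmarks = X[rng.choice(500, 50, replace=False)]
Phi = nystrom_features(X, landmarks)     # 500 x 50 instead of 500 x 500
err = np.abs(Phi @ Phi.T - rbf_kernel(X, X)).max()
```

Downstream kernel computations can then operate on the 500×50 feature matrix rather than the full 500×500 kernel, which is the scalability gain the low-rank embedding targets.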
Yeo Jin Jung — Department of Statistics, University of Chicago
Yating Liu — Department of Statistics, University of Chicago
So Won Jeong — Booth School of Business, University of Chicago
Zixuan Wu — Georgia Institute of Technology (Robotics)
Claire Donnat — University of Chicago (Statistics, graphs, biomedical data analysis, latent variable models, brain connectomics)