Relevance-Aware Thresholding in Online Conformal Prediction for Time Series

📅 2025-10-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing online conformal prediction (OCP) methods update the conformity threshold solely based on binary coverage signals, neglecting the continuous relationship between prediction intervals and true values—leading to abrupt threshold adjustments and overly conservative intervals. This work proposes a correlation-aware OCP framework that replaces discrete coverage feedback with a differentiable, bounded correlation scoring function (e.g., normalized distance or calibrated likelihood), enabling smooth, adaptive threshold updates. The method dynamically calibrates prediction intervals under nonstationary time-series distributions while strictly maintaining the target coverage level. Experiments demonstrate that, compared to state-of-the-art OCP approaches, our method reduces average interval width by 12.7%–28.4%, while keeping coverage deviation below 0.5%. This significantly improves the precision–reliability trade-off in uncertainty quantification.

Technology Category

Application Category

📝 Abstract
Uncertainty quantification has received considerable interest in recent works in Machine Learning. In particular, Conformal Prediction (CP) gains ground in this field. For the case of time series, Online Conformal Prediction (OCP) becomes an option to address the problem of data distribution shift over time. Indeed, the idea of OCP is to update a threshold of some quantity (whether the miscoverage level or the quantile) based on the distribution observation. To evaluate the performance of OCP methods, two key aspects are typically considered: the coverage validity and the prediction interval width minimization. Recently, new OCP methods have emerged, offering long-run coverage guarantees and producing more informative intervals. However, during the threshold update step, most of these methods focus solely on the validity of the prediction intervals~--~that is, whether the ground truth falls inside or outside the interval~--~without accounting for their relevance. In this paper, we aim to leverage this overlooked aspect. Specifically, we propose enhancing the threshold update step by replacing the binary evaluation (inside/outside) with a broader class of functions that quantify the relevance of the prediction interval using the ground truth. This approach helps prevent abrupt threshold changes, potentially resulting in narrower prediction intervals. Indeed, experimental results on real-world datasets suggest that these functions can produce tighter intervals compared to existing OCP methods while maintaining coverage validity.
Problem

Research questions and friction points this paper is trying to address.

Enhancing threshold updates in online conformal prediction
Quantifying prediction interval relevance using ground truth
Achieving narrower intervals while maintaining coverage validity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Replaces binary evaluation with relevance functions
Prevents abrupt threshold changes in updates
Produces narrower prediction intervals while valid
🔎 Similar Papers
No similar papers found.
T
Théo Dupuy
EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Ales, France
Binbin Xu
Binbin Xu
HUAWEI Noah's Ark Lab
SLAMRoboticsComputer Vision
S
Stéphane Perrey
EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Montpellier, France
J
Jacky Montmain
EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Ales, France
A
Abdelhak Imoussaten
EuroMov Digital Health in Motion, Univ Montpellier, IMT Mines Ales, Ales, France