🤖 AI Summary
To address the robustness issue of online conformal prediction (OCP) under uniform label noise—where the realized mis-coverage rate deviates persistently from the target level $\alpha$—this paper proposes Noise-Robust Online Conformal Prediction (NR-OCP). The key innovation is a novel robust pinball loss that provides an unbiased estimate of the clean pinball loss without access to true labels, enabling, for the first time, theoretical elimination of the coverage gap under both constant and dynamic learning rate schedules. Theoretical analysis establishes that, under the uniform-noise assumption with a known noise rate, NR-OCP's empirical and expected coverage errors converge to zero at rate $\mathcal{O}(T^{-1/2})$, guaranteeing asymptotically exact long-term coverage. Empirical results demonstrate that NR-OCP consistently maintains the target coverage across diverse noise levels while yielding smaller average prediction set sizes, thereby improving both statistical efficiency and robustness.
📝 Abstract
Conformal prediction is an emerging technique for uncertainty quantification that constructs prediction sets guaranteed to contain the true label with a predefined probability. Recent work develops online conformal prediction methods that adaptively construct prediction sets to accommodate distribution shifts. However, existing algorithms typically assume perfect label accuracy, which rarely holds in practice. In this work, we investigate the robustness of online conformal prediction under uniform label noise with a known noise rate, in both constant and dynamic learning rate schedules. We show that label noise causes a persistent gap between the actual mis-coverage rate and the desired rate $\alpha$, leading to either overestimated or underestimated coverage guarantees. To address this issue, we propose Noise Robust Online Conformal Prediction (dubbed NR-OCP), which updates the threshold with a novel robust pinball loss that provides an unbiased estimate of the clean pinball loss without requiring ground-truth labels. Our theoretical analysis shows that NR-OCP eliminates the coverage gap in both constant and dynamic learning rate schedules, achieving a convergence rate of $\mathcal{O}(T^{-1/2})$ for both empirical and expected coverage errors under uniform label noise. Extensive experiments demonstrate the effectiveness of our method by achieving both precise coverage and improved efficiency.
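To make the threshold-update mechanism concrete, here is a minimal sketch of the standard (noise-free) online conformal update driven by the pinball-loss subgradient, in the style of adaptive conformal inference. It is not the paper's NR-OCP algorithm: the robust pinball loss and its noise-rate correction are not reproduced here, and the Gaussian score stream and step size are illustrative stand-ins.

```python
import numpy as np

def pinball_grad(q, s, alpha):
    """Subgradient w.r.t. the threshold q of the pinball loss at quantile
    level 1 - alpha: rho(u) = (1 - alpha) * max(u, 0) + alpha * max(-u, 0),
    with u = s - q, where s is the nonconformity score of the true label."""
    return alpha if s <= q else -(1.0 - alpha)

# Online gradient descent on the threshold: q_{t+1} = q_t - eta * grad.
# A miscoverage event (s > q) pushes q up by eta * (1 - alpha); a covered
# round pushes it down by eta * alpha, so the long-run miscoverage
# frequency is driven toward alpha.
rng = np.random.default_rng(0)
alpha, eta, q = 0.1, 0.05, 0.0
T, misses = 5000, 0
for t in range(T):
    s = rng.normal()            # stand-in score stream (i.i.d. N(0, 1))
    misses += (s > q)           # miscover when the score exceeds the threshold
    q -= eta * pinball_grad(q, s, alpha)

miscoverage_rate = misses / T
print(f"empirical miscoverage: {miscoverage_rate:.3f} (target {alpha})")
```

Under label noise, the indicator-style feedback above is computed from noisy labels and becomes biased, which is exactly the gap NR-OCP's robust pinball loss is designed to remove via its unbiasedness under the known uniform noise rate.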