AdaptNC: Adaptive Nonconformity Scores for Uncertainty-Aware Autonomous Systems in Dynamic Environments

📅 2026-02-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
In dynamic environments, distribution shifts often cause conventional conformal prediction methods to produce overly conservative, volume-inefficient uncertainty regions that fail to balance coverage with tightness. To address this, the paper proposes AdaptNC, a framework that, for the first time, jointly adapts the nonconformity scoring function and the conformal threshold online. AdaptNC employs an adaptive reweighting mechanism to optimize the scoring function and incorporates a replay buffer strategy to mitigate the coverage fluctuations induced by score switching. Evaluated on robotic benchmarks involving multi-agent policy switches, abrupt environmental changes, and sensor degradation, AdaptNC substantially reduces prediction region volume while rigorously maintaining the target coverage, outperforming existing approaches that adapt only the threshold.

📝 Abstract
Rigorous uncertainty quantification is essential for the safe deployment of autonomous systems in unconstrained environments. Conformal Prediction (CP) provides a distribution-free framework for this task, yet its standard formulations rely on exchangeability assumptions that are violated by the distribution shifts inherent in real-world robotics. Existing online CP methods maintain target coverage by adaptively scaling the conformal threshold, but typically employ a static nonconformity score function. We show that this fixed geometry leads to highly conservative, volume-inefficient prediction regions when environments undergo structural shifts. To address this, we propose AdaptNC, a framework for the joint online adaptation of both the nonconformity score parameters and the conformal threshold. AdaptNC leverages an adaptive reweighting scheme to optimize score functions, and introduces a replay buffer mechanism to mitigate the coverage instability that occurs during score transitions. We evaluate AdaptNC on diverse robotic benchmarks involving multi-agent policy changes, environmental changes, and sensor degradation. Our results demonstrate that AdaptNC significantly reduces prediction region volume compared to state-of-the-art threshold-only baselines while maintaining target coverage levels.
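The threshold-only online CP baselines the abstract contrasts against typically follow an adaptive-conformal-inference-style update: after each observation, the working miscoverage level is nudged up on a hit and down on a miss. The sketch below is illustrative only — the function, parameter names, and the sliding-window calibration are assumptions, not details from the paper — and shows why such methods can hold coverage through a distribution shift even with a fixed nonconformity score:

```python
import numpy as np

# Illustrative sketch (not the paper's method): threshold-only online
# conformal prediction with an ACI-style update of the miscoverage level.
def run_threshold_only_cp(scores, alpha=0.1, gamma=0.02, window=100):
    """Track coverage of 1D prediction regions over a score stream.

    scores : nonconformity scores s_t of incoming observations
    alpha  : target miscoverage rate (target coverage = 1 - alpha)
    gamma  : step size of the online update
    window : size of the sliding calibration set
    """
    alpha_t = alpha   # working miscoverage level, adapted online
    calib = []        # sliding calibration window of past scores
    covered = []
    for s in scores:
        if len(calib) >= 20:  # wait for some calibration data
            # Threshold = empirical quantile of the calibration window.
            q = np.quantile(calib, min(max(1.0 - alpha_t, 0.0), 1.0))
            err = 1.0 if s > q else 0.0  # 1 = point fell outside region
            covered.append(1.0 - err)
            # ACI-style update: tighten after hits, loosen after misses.
            alpha_t = alpha_t + gamma * (alpha - err)
        calib.append(s)
        if len(calib) > window:
            calib.pop(0)
    return float(np.mean(covered))

rng = np.random.default_rng(0)
# Score stream with an abrupt distribution shift halfway through,
# mimicking the environmental changes in the paper's benchmarks.
stream = np.abs(np.concatenate([rng.normal(0, 1, 2000),
                                rng.normal(0, 3, 2000)]))
cov = run_threshold_only_cp(stream, alpha=0.1)
print(f"empirical coverage: {cov:.3f}")
```

Because only the threshold adapts here, the region geometry induced by the fixed score simply inflates after the shift — coverage is preserved, but at the cost of volume, which is the inefficiency AdaptNC targets by adapting the score function as well.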
Problem

Research questions and friction points this paper is trying to address.

Conformal Prediction
distribution shifts
nonconformity score
autonomous systems
uncertainty quantification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive Nonconformity Scores
Conformal Prediction
Online Adaptation
Distribution Shift
Uncertainty Quantification
Renukanandan Tumu
Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, Pennsylvania
Aditya Singh
Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, Pennsylvania
Rahul Mangharam
Professor of Electrical Engineering and Computer Science, University of Pennsylvania
Safe Autonomous Systems · Cyber-Physical Systems · Medical Devices