🤖 AI Summary
Existing trajectory forecasting methods often produce uncalibrated trajectory ensembles, lacking theoretical guarantees on coverage—especially under distribution shifts. Method: This paper proposes an online adaptive calibration framework grounded in conformal prediction. It introduces a dynamic online update mechanism and an optimization strategy that explicitly models inter-step temporal dependencies, yielding nonstationary, time-varying prediction intervals that capture time-dependent uncertainty. Unlike static calibration, the framework adapts in real time to distributional shifts without retraining. Results: Evaluated across autonomous driving, hurricane track forecasting, and epidemic spread prediction, the method achieves strict finite-sample coverage (e.g., 90%) while significantly improving interval sharpness (average 12.7% reduction in interval width) and forecast accuracy. To our knowledge, this is the first trajectory uncertainty calibration framework unifying conformal prediction, multi-source trajectory ensemble integration, online adaptation, and temporal dependency modeling.
📝 Abstract
Future trajectories play an important role across domains such as autonomous driving, hurricane forecasting, and epidemic modeling, where practitioners commonly generate ensemble paths by sampling probabilistic models or leveraging multiple autoregressive predictors. While these trajectories reflect inherent uncertainty, they are typically uncalibrated. We propose a unified framework based on conformal prediction that transforms sampled trajectories into calibrated prediction intervals with theoretical coverage guarantees. By introducing a novel online update step and an optimization step that captures inter-step dependencies, our method can produce discontinuous prediction intervals around each trajectory, naturally capture temporal dependencies, and yield sharper, more adaptive uncertainty estimates.