Building Real-time Awareness of Out-of-distribution in Trajectory Prediction for Autonomous Vehicles

📅 2024-09-25
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address unreliable trajectory predictions in autonomous driving caused by train-test distribution shifts, particularly in rare out-of-distribution (OOD) scenarios such as roundabouts and sudden braking, this paper is the first to model OOD detection as a real-time change-point detection task. The authors propose a lightweight, online anomaly-aware framework that jointly leverages temporal modeling of prediction residuals, comparative analysis of in-distribution versus OOD error patterns, and sliding-window statistical inference to flag OOD behavior at any prediction timestep. Evaluated across multiple real-world benchmarks, the method improves F1-score by 27.3%, keeps the false-positive rate below 1.2%, adds less than 3 ms of latency per frame, and is compatible with mainstream trajectory prediction models.

📝 Abstract
Accurate trajectory prediction is essential for the safe operation of autonomous vehicles in real-world environments. Even well-trained machine learning models may produce unreliable predictions due to discrepancies between training data and real-world conditions encountered during inference. In particular, the training dataset tends to overrepresent common scenes (e.g., straight lanes) while underrepresenting less frequent ones (e.g., traffic circles). In addition, it often overlooks unpredictable real-world events such as sudden braking or falling objects. To ensure safety, it is critical to detect in real time when a model's predictions become unreliable. Leveraging the intuition that in-distribution (ID) scenes exhibit error patterns similar to the training data while out-of-distribution (OOD) scenes do not, we introduce a principled, real-time approach for OOD detection by framing it as a change-point detection problem. We address the challenging setting where the OOD scenes are deceptive, meaning that they are not easily detectable by human intuition. Our lightweight solutions can handle the occurrence of OOD at any time during trajectory prediction inference. Experimental results on multiple real-world datasets using a benchmark trajectory prediction model demonstrate the effectiveness of our methods.
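The change-point framing above can be sketched with a one-sided CUSUM test on standardized prediction residuals. This is an illustrative stand-in for the paper's sliding-window statistic, not the authors' implementation; the function name and the slack (`k`) and threshold (`h`) parameters are assumptions for the sketch:

```python
def cusum_ood_detector(residuals, mu_id, sigma_id, k=0.5, h=5.0):
    """Flag the first timestep where prediction residuals drift away
    from the in-distribution (ID) error pattern.

    residuals       : per-timestep prediction errors observed online
    mu_id, sigma_id : mean/std of residuals on ID (training-like) data
    k               : slack (in std units) tolerated before evidence accumulates
    h               : decision threshold (in std units) for declaring a change
    Returns the detection timestep, or None if no change is flagged.
    """
    s = 0.0  # running CUSUM statistic
    for t, r in enumerate(residuals):
        z = (r - mu_id) / sigma_id   # standardize against ID error pattern
        s = max(0.0, s + z - k)      # one-sided CUSUM update
        if s > h:                    # accumulated evidence exceeds threshold
            return t                 # change point -> scene is likely OOD
    return None
```

For example, with ID residuals near 0.2 and a sudden jump to 2.0, the detector fires almost immediately after the change. In practice the ID statistics would be estimated from held-out training residuals, and `h` trades detection delay against false-positive rate.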
Problem

Research questions and friction points this paper is trying to address.

Detecting unreliable predictions from autonomous vehicle trajectory models
Identifying deceptive out-of-distribution scenes in real time
Handling discrepancies between training data and real-world conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Real-time OOD detection in trajectory prediction
Change-point detection for unreliable predictions
Lightweight solution for deceptive OOD scenes