🤖 AI Summary
3D anomaly detection in autonomous driving is hindered by the scarcity of high-quality, multimodal benchmark datasets. To address this, we introduce STU, the first publicly available 3D anomaly segmentation dataset tailored for driving scenarios. It comprises thousands of temporally synchronized LiDAR-camera frames and supports long-, mid-, and short-range anomaly detection. STU uniquely provides fine-grained, point-wise 3D semantic annotations with temporal consistency and cross-modal alignment, enabling precise anomaly localization. We further propose a unified evaluation framework compatible with both voxel- and point-based 3D segmentation models. We open-source the STU dataset and evaluation code, and benchmark state-of-the-art 3D segmentation methods, exposing where current models break down and where improvements are most needed. STU fills a fundamental gap in the data and evaluation infrastructure for 3D anomaly segmentation in autonomous driving.
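As a rough illustration of what a unified, point-level protocol could look like, the sketch below maps voxel-level anomaly scores back onto the raw LiDAR points, so that voxel- and point-based models can be scored on an identical set of points. This is a minimal sketch under assumed conventions: the function names, the voxel hashing scheme, and the default score for unscored voxels are illustrative and not taken from the STU codebase.

```python
# Hypothetical adapter for a unified point-level protocol: a voxel-based
# model's output is evaluated by assigning each LiDAR point the anomaly
# score of the voxel it falls in. Names and the 0.0 default are assumptions.
import numpy as np

def voxel_scores_to_points(points: np.ndarray, voxel_scores: dict,
                           voxel_size: float = 0.1) -> np.ndarray:
    """points: (N, 3) xyz coordinates; voxel_scores: {(i, j, k): anomaly_score}."""
    idx = np.floor(points / voxel_size).astype(np.int64)
    # Points falling in voxels the model never scored receive a default of 0.0.
    return np.array([voxel_scores.get(tuple(v), 0.0) for v in idx])
```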
📝 Abstract
To operate safely, autonomous vehicles (AVs) need to detect and handle unexpected objects or anomalies on the road. While anomaly detection and segmentation have been studied extensively in 2D, the 3D setting remains underexplored. Existing datasets lack the high-quality multimodal data typically available on AVs. This paper presents a novel dataset for anomaly segmentation in driving scenarios. To the best of our knowledge, it is the first publicly available dataset focused on road anomaly segmentation with dense 3D semantic labeling, incorporating both LiDAR and camera data as well as sequential information to enable anomaly detection across various ranges. This capability is critical for the safe navigation of autonomous vehicles. We adapt and evaluate several baseline models for 3D segmentation, highlighting the challenges of 3D anomaly detection in driving environments. Our dataset and evaluation code will be openly available, facilitating the testing and performance comparison of different approaches.
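The abstract does not specify the evaluation metrics, so the following is a minimal sketch assuming the per-point protocol common in anomaly segmentation benchmarks: each method outputs a per-point anomaly score, and performance is summarized by AUPR and the false-positive rate at 95% true-positive rate. All names are illustrative.

```python
# Hypothetical point-wise anomaly evaluation; the metric choices (AUPR,
# FPR@95TPR) are assumed from common anomaly-segmentation practice,
# not confirmed by the paper.
import numpy as np
from sklearn.metrics import average_precision_score, roc_curve

def anomaly_metrics(scores: np.ndarray, labels: np.ndarray) -> dict:
    """scores: (N,) per-point anomaly scores; labels: (N,) 1 = anomaly, 0 = inlier."""
    aupr = average_precision_score(labels, scores)   # area under precision-recall
    fpr, tpr, _ = roc_curve(labels, scores)
    # FPR at the first operating point reaching 95% true-positive rate.
    fpr95 = float(fpr[np.searchsorted(tpr, 0.95)])
    return {"AUPR": aupr, "FPR@95TPR": fpr95}

# Toy check: perfectly separated scores give AUPR = 1.0 and FPR@95TPR = 0.0.
print(anomaly_metrics(np.array([0.9, 0.8, 0.7, 0.2, 0.1]),
                      np.array([1, 1, 1, 0, 0])))
```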