🤖 AI Summary
Existing autonomous driving datasets lack support for controlled, parameterized, and reproducible degradation across multimodal sensors (cameras, radar, LiDAR), which hinders systematic robustness evaluation of perception fusion models under partial sensor failure or occlusion. To address this, we introduce Occluded nuScenes, the first nuScenes-based dataset to enable controllable multimodal degradation. We propose four image-level camera occlusion types, two of which are novel, and develop parameterized simulation scripts for radar and LiDAR that each model three physically plausible degradation modes. The dataset is released in both full and mini versions, supporting comparable and reproducible evaluation under adverse conditions. By providing systematically controlled, sensor-specific, and physically grounded degradations, Occluded nuScenes closes a critical gap in robustness testing and enables rigorous assessment of fusion model resilience to realistic sensor impairments.
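As a rough illustration of what a controlled, image-level camera occlusion could look like, the sketch below masks a parameterized fraction of a frame with an opaque patch. The function name, signature, and patch-based scheme are hypothetical: the summary does not detail the four occlusion types, so this only demonstrates the general idea of a parameterized, seeded, and therefore reproducible degradation.

```python
import numpy as np

def occlude_patch(image: np.ndarray, coverage: float = 0.2, seed: int = 0) -> np.ndarray:
    """Mask a rectangular region of an H x W x C image with black pixels.

    `coverage` is the fraction of the image area to occlude; a fixed
    `seed` makes the degradation reproducible across runs.
    """
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    # Derive patch dimensions from the requested area fraction.
    ph = int(h * np.sqrt(coverage))
    pw = int(w * np.sqrt(coverage))
    top = rng.integers(0, h - ph + 1)
    left = rng.integers(0, w - pw + 1)
    occluded = image.copy()
    occluded[top:top + ph, left:left + pw] = 0
    return occluded
```

Because the occlusion is fully determined by `coverage` and `seed`, two research groups applying the same parameters would obtain identical degraded images, which is the property that distinguishes this kind of benchmark from datasets with only incidental sensor noise.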
📝 Abstract
Robust perception in automated driving requires reliable performance under adverse conditions, where sensors may be affected by partial failures or environmental occlusions. Although existing autonomous driving datasets inherently contain sensor noise and environmental variability, very few enable controlled, parameterised, and reproducible degradations across multiple sensing modalities. This gap limits the ability to systematically evaluate how perception and fusion architectures perform under well-defined adverse conditions. To address this limitation, we introduce the Occluded nuScenes Dataset, a novel extension of the widely used nuScenes benchmark. For the camera modality, we release both the full and mini versions with four types of occlusions, two adapted from public implementations and two newly designed. For radar and LiDAR, we provide parameterised occlusion scripts that implement three types of degradations each, enabling flexible and repeatable generation of occluded data. This resource supports consistent, reproducible evaluation of perception models under partial sensor failures and environmental interference. By releasing the first multi-sensor occlusion dataset with controlled and reproducible degradations, we aim to advance research on robust sensor fusion, resilience analysis, and safety-critical perception in automated driving.
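To make the idea of parameterised radar and LiDAR occlusion scripts concrete, here is a minimal sketch of two point-cloud degradations of the kind such scripts might implement: removing an azimuth sector (emulating a blocked field of view) and randomly discarding returns (emulating attenuation). The function names, the (N, D) array layout, and the specific degradation modes are assumptions for illustration, not the released scripts themselves.

```python
import numpy as np

def drop_angular_sector(points: np.ndarray, start_deg: float, width_deg: float) -> np.ndarray:
    """Remove all returns inside an azimuth sector of the sensor frame.

    `points` is an (N, D) array whose first two columns are x and y;
    the sector starts at `start_deg` and spans `width_deg` degrees.
    """
    azimuth = np.degrees(np.arctan2(points[:, 1], points[:, 0])) % 360.0
    end = (start_deg + width_deg) % 360.0
    if start_deg <= end:
        inside = (azimuth >= start_deg) & (azimuth < end)
    else:  # sector wraps around 0 degrees
        inside = (azimuth >= start_deg) | (azimuth < end)
    return points[~inside]

def drop_points(points: np.ndarray, drop_rate: float, seed: int = 0) -> np.ndarray:
    """Randomly discard a fraction of returns, with a seed for reproducibility."""
    rng = np.random.default_rng(seed)
    keep = rng.random(len(points)) >= drop_rate
    return points[keep]
```

Parameterising the degradations this way, rather than shipping fixed corrupted point clouds, lets users sweep severity (sector width, drop rate) while keeping runs repeatable via the seed.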