🤖 AI Summary
To address the limited robustness of SLAM systems under challenging conditions such as adverse weather, low illumination, and uneven terrain, this work introduces the first multimodal SLAM benchmark dataset specifically designed for extreme operational scenarios. Methodologically, it pioneers the systematic integration of 4D millimeter-wave radar with infrared and depth cameras, alongside 3D LiDAR, RGB cameras, and GPS/IMU sensors, totaling ten modalities. It also proposes techniques for multi-sensor spatiotemporal synchronization, 4D radar point cloud decoding and alignment, cross-modal registration among infrared, depth, and LiDAR data, and high-precision ground-truth generation via tightly coupled GPS/INS fusion. The dataset covers previously underrepresented scenarios, including snowfall, rainy nights, and gravel roads, and comprises 18.5 km of traverses, 69 minutes of synchronized recordings, and 660 GB of open-source data. This resource enables rigorous evaluation and development of robust SLAM algorithms under extreme conditions.
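The multi-sensor spatiotemporal synchronization mentioned above can be illustrated at a high level by software-side timestamp association: pairing each frame of a reference stream with the nearest-in-time frame of another stream, subject to a tolerance. This is a minimal sketch for intuition only; the function name, rates, and tolerance are illustrative assumptions, and the dataset's actual synchronization pipeline is more involved than nearest-neighbor matching.

```python
import bisect

def associate_nearest(ref_stamps, sensor_stamps, max_dt=0.05):
    """Pair each reference timestamp with the nearest sensor timestamp.

    Returns (ref_index, sensor_index) pairs whose time difference is
    within max_dt seconds. Illustrative only; real pipelines typically
    add hardware triggering and clock-offset estimation.
    """
    sensor_stamps = sorted(sensor_stamps)
    pairs = []
    for i, t in enumerate(ref_stamps):
        j = bisect.bisect_left(sensor_stamps, t)
        # Candidates: the insertion point and its left neighbor.
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(sensor_stamps):
                if best is None or abs(sensor_stamps[k] - t) < abs(sensor_stamps[best] - t):
                    best = k
        if best is not None and abs(sensor_stamps[best] - t) <= max_dt:
            pairs.append((i, best))
    return pairs

# Example: a 10 Hz LiDAR as reference, matched against a ~30 Hz camera stream.
lidar = [0.00, 0.10, 0.20]
camera = [0.001, 0.034, 0.067, 0.099, 0.133, 0.166, 0.198]
print(associate_nearest(lidar, camera))  # [(0, 0), (1, 3), (2, 6)]
```

Each LiDAR sweep is matched to the camera frame closest in time, and pairs farther apart than `max_dt` are dropped rather than force-matched.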
📝 Abstract
Adverse weather conditions, low-light environments, and bumpy road surfaces pose significant challenges to SLAM in robotic navigation and autonomous driving. Existing datasets in this field predominantly rely on single sensors or on combinations of LiDAR, cameras, and IMUs. However, 4D millimeter-wave radar is robust in adverse weather, infrared cameras excel at capturing detail under low-light conditions, and depth images provide richer spatial information; multi-sensor fusion methods also show potential for better adaptation to bumpy roads. Although some SLAM studies incorporate these sensors and conditions, comprehensive datasets that address low-light environments and bumpy roads, or that feature a sufficiently diverse range of sensor data, are still lacking. In this study, we introduce a multi-sensor dataset covering challenging scenarios such as snowy weather, rainy weather, nighttime conditions, speed bumps, and rough terrain. The dataset includes sensors rarely used under extreme conditions, such as 4D millimeter-wave radar, infrared cameras, and depth cameras, alongside 3D LiDAR, RGB cameras, GPS, and IMU. It supports both autonomous driving and ground-robot applications and provides reliable GPS/INS ground truth covering structured and semi-structured terrain. We evaluated SLAM algorithms on this dataset across multiple modalities: RGB images, infrared images, depth images, LiDAR, and 4D millimeter-wave radar. The dataset spans a total of 18.5 km and 69 minutes and comprises approximately 660 GB, offering a valuable resource for advancing SLAM research under complex and extreme conditions. Our dataset is available at https://github.com/GongWeiSheng/DIDLM.
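Evaluating SLAM output against GPS/INS ground truth, as described above, commonly reduces to a trajectory-error metric such as the absolute trajectory error (ATE). The sketch below shows the core computation under simplifying assumptions: the two trajectories are already time-associated and expressed in a common frame (the function name is ours, not from the paper). Standard tooling additionally performs a rigid-body alignment (e.g. an Umeyama fit) before scoring.

```python
import math

def ate_rmse(gt_xyz, est_xyz):
    """RMSE of per-pose Euclidean distance between ground-truth and
    estimated positions. Assumes equal-length, time-associated lists of
    (x, y, z) tuples in a common frame; no alignment step is applied.
    """
    assert len(gt_xyz) == len(est_xyz) and gt_xyz
    sq = [
        sum((g - e) ** 2 for g, e in zip(p_gt, p_est))
        for p_gt, p_est in zip(gt_xyz, est_xyz)
    ]
    return math.sqrt(sum(sq) / len(sq))

# Toy example: the estimate wobbles 0.1 m laterally around the truth.
gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.0, 0.1, 0.0), (1.0, -0.1, 0.0), (2.0, 0.1, 0.0)]
print(ate_rmse(gt, est))  # ~0.1 (up to floating-point rounding)
```

Reporting a single RMSE per sequence makes algorithms comparable across the dataset's weather and terrain conditions.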