🤖 AI Summary
To address the insufficient robustness of visual odometry and relocalization in autonomous driving under cross-seasonal and multi-weather conditions (rain, snow, fog, night), this paper introduces the first open-source benchmark dataset of its kind. It covers nine diverse scenarios (including urban, highway, tunnel, and parking-garage environments), spanning more than 350 km and encompassing all four seasons as well as extreme illumination and meteorological conditions. Ground-truth poses with centimeter-level global consistency are obtained by fusing tightly coupled stereo visual-inertial odometry (VIO) with RTK-GNSS. Rigorous multi-sensor temporal synchronization, joint calibration, and cross-scene pose optimization ensure consistent accuracy across conditions. The dataset provides a rigorous, authoritative benchmark for evaluating and validating visual SLAM algorithms in complex real-world environments, significantly advancing research on all-weather, cross-season robust localization.
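The paper's own pipeline tightly couples stereo VIO with RTK-GNSS inside the estimator to produce globally consistent reference poses. As a much simpler, illustrative sketch of the underlying idea (not the authors' method), the snippet below aligns a locally accurate VIO trajectory to globally referenced GNSS positions with a rigid transform via the Umeyama method; the function name and interface are hypothetical.

```python
import numpy as np

def align_vio_to_gnss(vio_xyz, gnss_xyz):
    """Rigid (rotation + translation) alignment of a VIO trajectory
    to globally referenced GNSS positions (Umeyama method).

    vio_xyz, gnss_xyz: (N, 3) arrays of time-synchronized positions.
    Returns R (3x3) and t (3,) such that R @ vio + t ~= gnss.
    """
    mu_v = vio_xyz.mean(axis=0)
    mu_g = gnss_xyz.mean(axis=0)
    # Cross-covariance of the centered point sets
    cov = (gnss_xyz - mu_g).T @ (vio_xyz - mu_v) / len(vio_xyz)
    U, _, Vt = np.linalg.svd(cov)
    # Sign correction keeps R a proper rotation (det = +1)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    t = mu_g - R @ mu_v
    return R, t
```

A full system would instead jointly optimize poses, IMU biases, and GNSS residuals in one factor graph; this closed-form alignment only conveys how local odometry is anchored to a global frame.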
📝 Abstract
We present a novel dataset covering seasonal and challenging perceptual conditions for autonomous driving. Among others, it enables research on visual odometry, global place recognition, and map-based re-localization tracking. The data was collected in different scenarios and under a wide variety of weather and illumination conditions, including day and night. This resulted in more than 350 km of recordings in nine different environments, ranging from a multi-level parking garage and urban driving (including tunnels) to countryside and highway. We provide globally consistent reference poses with up-to-centimeter accuracy obtained from the fusion of direct stereo visual-inertial odometry with RTK-GNSS. The full dataset is available at https://www.4seasons-dataset.com.