4Seasons: A Cross-Season Dataset for Multi-Weather SLAM in Autonomous Driving

📅 2020-09-14
🏛️ German Conference on Pattern Recognition
📈 Citations: 80
Influential: 4
🤖 AI Summary
To address the limited robustness of visual odometry and relocalization in autonomous driving under cross-seasonal and multi-weather conditions (rain, snow, fog, night), this paper introduces an open-source benchmark dataset. It covers nine diverse scenarios, including urban, highway, tunnel, and parking-garage environments, spanning more than 350 km of recordings across all four seasons as well as challenging illumination and weather conditions. Reference poses with centimeter-level global consistency are generated by fusing tightly coupled stereo visual-inertial odometry (VIO) with RTK-GNSS. Careful multi-sensor time synchronization, joint calibration, and cross-sequence pose optimization keep accuracy consistent across conditions. The dataset provides a rigorous benchmark for evaluating and validating visual SLAM algorithms in complex real-world environments, supporting research on all-weather, cross-season robust localization.
📝 Abstract
We present a novel dataset covering seasonal and challenging perceptual conditions for autonomous driving. Among others, it enables research on visual odometry, global place recognition, and map-based re-localization tracking. The data was collected in different scenarios and under a wide variety of weather conditions and illuminations, including day and night. This resulted in more than 350 km of recordings in nine different environments, ranging from a multi-level parking garage and urban areas (including tunnels) to countryside and highway. We provide globally consistent reference poses with up to centimeter accuracy, obtained from the fusion of direct stereo visual-inertial odometry with RTK-GNSS. The full dataset is available at https://www.4seasons-dataset.com.
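The reference poses come from fusing direct stereo VIO with RTK-GNSS. As a rough intuition for one piece of such a pipeline, the sketch below aligns a locally accurate VIO trajectory to globally referenced GNSS positions with a closed-form similarity transform (Umeyama-style least squares). This is a minimal illustration only, assuming synchronized position-to-position correspondences; the paper's actual system tightly couples the sensors in a joint optimization rather than performing a post-hoc alignment, and the function name here is hypothetical.

```python
import numpy as np

def align_vio_to_gnss(vio_xyz, gnss_xyz):
    """Closed-form least-squares similarity transform (s, R, t) mapping
    VIO positions onto GNSS positions: gnss ~ s * R @ vio + t.

    vio_xyz, gnss_xyz: (N, 3) arrays of time-synchronized positions.
    Hypothetical helper for illustration, not the paper's pipeline.
    """
    mu_s, mu_d = vio_xyz.mean(0), gnss_xyz.mean(0)
    xs, xd = vio_xyz - mu_s, gnss_xyz - mu_d

    # Cross-covariance between centered destination and source points.
    cov = xd.T @ xs / len(vio_xyz)
    U, D, Vt = np.linalg.svd(cov)

    # Reflection guard: force a proper rotation (det(R) = +1).
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0

    R = U @ S @ Vt
    scale = np.trace(np.diag(D) @ S) / xs.var(0).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

def apply_sim3(scale, R, t, xyz):
    """Apply the recovered similarity transform to an (N, 3) trajectory."""
    return scale * xyz @ R.T + t
```

In practice one would check alignment residuals per sequence; large residuals flag GNSS outages (e.g. tunnels, garages), which is exactly where the tightly coupled formulation in the paper matters.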
Problem

Research questions and friction points this paper is trying to address.

Visual odometry and relocalization degrade under seasonal appearance change and adverse weather
Existing autonomous-driving benchmarks lack cross-season coverage with accurate ground truth
Robust all-weather localization needs centimeter-accurate poses in diverse environments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cross-season, multi-weather SLAM dataset: 350+ km across nine environments
Direct stereo visual-inertial odometry tightly fused with RTK-GNSS
Globally consistent reference poses with up to centimeter accuracy
Patrick Wenzel
Technical University of Munich and Artisense
Rui Wang
Technical University of Munich and Artisense
Nan Yang
Technical University of Munich and Artisense
Qing Cheng
Artisense
Qadeer Ahmad Khan
Technical University of Munich and Artisense
Lukas von Stumberg
Technical University of Munich and Artisense
Niclas Zeller
Karlsruhe University of Applied Sciences
Computer Vision, SLAM, 3D Reconstruction, Light Field Imaging
Daniel Cremers
Technical University of Munich
Computer Vision, Machine Learning, Optimization, Robotics