🤖 AI Summary
Agricultural robots face significant challenges in localization, mapping, and navigation under natural illumination variations, motion blur, uneven terrain, and long-range visual aliasing—challenges exacerbated by the absence of tightly synchronized, ground-truth–annotated multimodal benchmark datasets. To address this, we present and publicly release the first high-precision, multimodal SLAM dataset specifically designed for soybean field environments. It integrates synchronized stereo infrared/RGB cameras, an IMU, multi-mode GNSS, and wheel odometry, with hardware-level timestamp synchronization and post-processed differential GNSS to deliver centimeter-accurate 6-DOF ground-truth trajectories and long-distance loop closures. The dataset comprises over two hours of real-world field sequences. We systematically evaluate state-of-the-art multimodal SLAM methods, identifying critical performance bottlenecks. This work fills a key gap in agricultural SLAM evaluation, enabling reproducible algorithm development and standardized benchmarking.
📝 Abstract
We present a multi-modal dataset collected in a soybean crop field, comprising over two hours of recorded data from sensors including a stereo infrared camera, a color camera, an accelerometer, a gyroscope, a magnetometer, GNSS (Single Point Positioning, Real-Time Kinematic, and Post-Processed Kinematic), and wheel odometry. The dataset captures key challenges inherent to robotics in agricultural environments, including variations in natural lighting, motion blur, rough terrain, and long, perceptually aliased sequences. By addressing these complexities, the dataset aims to support the development and benchmarking of advanced algorithms for localization, mapping, perception, and navigation in agricultural robotics. The platform and data collection system are designed to meet the key requirements for evaluating multi-modal SLAM systems: hardware synchronization of sensors, 6-DOF ground truth, and loop closures on long trajectories.
We run multimodal state-of-the-art SLAM methods on the dataset, exposing their current limitations in agricultural settings. The dataset and utilities to work with it are released at https://cifasis.github.io/rosariov2/.
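As an illustration of the kind of evaluation the ground-truth trajectories enable, a SLAM estimate can be scored with the standard absolute trajectory error (ATE) after rigid alignment to the ground truth. The sketch below (plain NumPy, synthetic data; it is not the dataset's official tooling, and the function names are illustrative) aligns an estimated trajectory to a reference with the Kabsch/Umeyama method and reports the RMSE:

```python
import numpy as np

def align_se3(est, gt):
    """Least-squares rigid (SE(3)) alignment of estimated positions to
    ground truth via the Kabsch/Umeyama method (no scale)."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    U, _, Vt = np.linalg.svd(E.T @ G)           # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error after alignment."""
    R, t = align_se3(est, gt)
    aligned = est @ R.T + t
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))

# Toy example: the "estimate" is the ground truth under a rigid transform,
# so the ATE after alignment should be ~0.
rng = np.random.default_rng(0)
gt = np.cumsum(rng.normal(size=(100, 3)), axis=0)   # synthetic trajectory
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
est = gt @ Rz.T + np.array([5.0, -2.0, 0.1])
print(f"ATE RMSE: {ate_rmse(est, gt):.3e} m")
```

In practice one would load matched timestamped poses from a dataset sequence and its ground truth in place of the synthetic arrays; tools such as evo implement this metric (and relative pose error) off the shelf.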