The Rosario Dataset v2: Multimodal Dataset for Agricultural Robotics

📅 2025-08-29
🤖 AI Summary
Agricultural robots face significant challenges in localization, mapping, and navigation under natural illumination variations, motion blur, uneven terrain, and long-range visual aliasing. These challenges are exacerbated by the scarcity of hardware-synchronized, ground-truth-annotated multimodal benchmark datasets. To address this gap, we present and publicly release a high-precision, multimodal SLAM dataset designed specifically for soybean field environments. It integrates synchronized stereo infrared/RGB cameras, an IMU, multi-mode GNSS, and wheel odometry, with hardware-level timestamp synchronization and post-processed differential GNSS delivering centimeter-accurate 6-DOF ground-truth trajectories and long-distance loop closures. The dataset comprises over two hours of real-world field sequences. We systematically evaluate state-of-the-art multimodal SLAM methods on it, identifying critical performance bottlenecks. This work fills a key gap in agricultural SLAM evaluation, enabling reproducible algorithm development and standardized benchmarking.

📝 Abstract
We present a multi-modal dataset collected in a soybean crop field, comprising over two hours of recorded data from sensors including a stereo infrared camera, a color camera, an accelerometer, a gyroscope, a magnetometer, GNSS (Single Point Positioning, Real-Time Kinematic, and Post-Processed Kinematic), and wheel odometry. This dataset captures key challenges inherent to robotics in agricultural environments, including variations in natural lighting, motion blur, rough terrain, and long, perceptually aliased sequences. By addressing these complexities, the dataset aims to support the development and benchmarking of advanced algorithms for localization, mapping, perception, and navigation in agricultural robotics. The platform and data collection system are designed to meet the key requirements for evaluating multi-modal SLAM systems, including hardware synchronization of sensors, 6-DOF ground truth, and loop closures on long trajectories. We run multimodal state-of-the-art SLAM methods on the dataset, exposing their current limitations in agricultural settings. The dataset and utilities to work with it are released at https://cifasis.github.io/rosariov2/.
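As an illustration of how a SLAM estimate is typically benchmarked against a 6-DOF ground-truth trajectory like the one this dataset provides, the sketch below computes the Absolute Trajectory Error (ATE) RMSE after rigid (Umeyama-style, no-scale) alignment. It is a minimal NumPy sketch under the assumption of time-associated position samples; the function name is illustrative and is not part of the released utilities.

```python
import numpy as np

def ate_rmse(gt, est):
    """ATE RMSE between ground-truth and estimated positions.

    gt, est: (N, 3) arrays of time-associated 3-D positions.
    Rigidly aligns `est` to `gt` (rotation + translation, no scale)
    before computing the error, as is standard practice.
    """
    # Center both trajectories on their means.
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    gt_c, est_c = gt - mu_gt, est - mu_est
    # Optimal rotation via SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(est_c.T @ gt_c)
    # Reflection correction keeps the result a proper rotation.
    d = np.sign(np.linalg.det(U @ Vt))
    S = np.diag([1.0, 1.0, d])
    R = Vt.T @ S @ U.T  # maps estimate frame into ground-truth frame
    aligned = est_c @ R.T + mu_gt
    # Root-mean-square of per-pose position errors.
    return np.sqrt(np.mean(np.sum((gt - aligned) ** 2, axis=1)))
```

In practice, trajectory evaluation tools add timestamp association and per-sequence statistics on top of this core computation, but the alignment-then-RMSE step is the essence of the ATE metric used to compare SLAM methods on datasets with centimeter-accurate ground truth.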
Problem

Research questions and friction points this paper is trying to address.

Addresses agricultural robotics challenges like lighting variations and rough terrain
Supports development of localization, mapping, perception, and navigation algorithms
Provides multi-modal synchronized sensor data with ground truth for SLAM evaluation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-sensor agricultural dataset collection
Hardware-synchronized SLAM evaluation system
Long-trajectory ground truth provision
Nicolás Soncini
CIFASIS (CONICET-UNR), Rosario, Santa Fe, Argentina
Javier Cremona
CIFASIS (CONICET-UNR), Rosario, Santa Fe, Argentina
Erica Vidal
CIFASIS (CONICET-UNR), Rosario, Santa Fe, Argentina
Maximiliano García
CIFASIS (CONICET-UNR), Rosario, Santa Fe, Argentina
Gastón Castro
Universidad de San Andrés (UDESA-CONICET), Buenos Aires, Argentina
Taihú Pire
CIFASIS, Rosario, Argentina
Robotics · Computer Vision · SLAM · Sensor Fusion · Artificial Intelligence