🤖 AI Summary
Robust SLAM evaluation for agricultural drones in complex farmland environments (flatlands, hilly terrain, terraces) is hindered by the absence of realistic, multimodal benchmark datasets. Method: This paper introduces the first publicly available, synchronized multimodal SLAM dataset specifically designed for agricultural scenarios. It provides synchronized 3D LiDAR, 4D radar, and IMU data with full calibration parameters, together with centimeter-accurate ground-truth trajectories from a fiber-optic inertial navigation system with Real-Time Kinematic capability (FINS_RTK). The dataset covers both boundary-following and coverage flight patterns, explicitly addressing agriculture-specific challenges, including low-texture surfaces, repetitive structures, and dynamic vegetation. Contribution/Results: Six high-quality sequences are released and used to benchmark four state-of-the-art multimodal SLAM methods. Experimental results demonstrate the necessity and effectiveness of multimodal sensor fusion for enhancing localization robustness in challenging agricultural environments.
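Using a multi-sensor dataset like this typically starts by applying the provided extrinsic calibrations to bring each sensor's measurements into a common frame. As a minimal sketch (NumPy; the frame names and function names here are illustrative, not the dataset's actual API), an extrinsic given as a rotation matrix and translation vector can be applied to a point cloud like so:

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, pts):
    """Apply transform T (e.g. radar frame -> LiDAR/body frame) to an (N, 3) point array."""
    return pts @ T[:3, :3].T + T[:3, 3]
```

For example, with an identity rotation and a pure translation, every point is simply shifted by that offset; chaining transforms (e.g. radar-to-body followed by body-to-world) is matrix multiplication of the corresponding 4x4 matrices.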
📝 Abstract
Multi-sensor Simultaneous Localization and Mapping (SLAM) is essential for Unmanned Aerial Vehicles (UAVs) performing agricultural tasks such as spraying, surveying, and inspection. However, real-world, multi-modal agricultural UAV datasets that enable research on robust operation remain scarce. To address this gap, we present AgriLiRa4D, a multi-modal UAV dataset designed for challenging outdoor agricultural environments. AgriLiRa4D spans three representative farmland types (flat, hilly, and terraced) and includes both boundary and coverage operation modes, resulting in six flight sequence groups. The dataset provides high-accuracy ground-truth trajectories from a Fiber Optic Inertial Navigation System with Real-Time Kinematic capability (FINS_RTK), along with synchronized measurements from a 3D LiDAR, a 4D Radar, and an Inertial Measurement Unit (IMU), accompanied by complete intrinsic and extrinsic calibrations. Leveraging its comprehensive sensor suite and diverse real-world scenarios, AgriLiRa4D supports diverse SLAM and localization studies and enables rigorous robustness evaluation against low-texture crops, repetitive patterns, dynamic vegetation, and other challenges of real agricultural environments. To further demonstrate its utility, we benchmark four state-of-the-art multi-sensor SLAM algorithms across different sensor combinations, highlighting the difficulty of the proposed sequences and the necessity of multi-modal approaches for reliable UAV localization. By filling a critical gap in agricultural SLAM datasets, AgriLiRa4D provides a valuable benchmark for the research community and contributes to advancing autonomous navigation technologies for agricultural UAVs. The dataset can be downloaded from: https://zhan994.github.io/AgriLiRa4D.
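Benchmarking SLAM methods against the FINS_RTK ground truth typically reduces to computing Absolute Trajectory Error (ATE): rigidly aligning the estimated trajectory to the ground truth and taking the RMSE of the residual positions. A minimal sketch of this standard procedure (NumPy, Umeyama-style rigid alignment over already time-associated position pairs; this is a generic evaluation recipe, not the paper's specific evaluation code):

```python
import numpy as np

def rigid_align(est, gt):
    """Find rotation R and translation t minimizing ||R @ est + t - gt|| (Umeyama, no scale)."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    # SVD of the cross-covariance between ground-truth and estimated positions
    U, _, Vt = np.linalg.svd(G.T @ E / len(est))
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0  # guard against reflections
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """ATE RMSE between (N, 3) estimated and ground-truth positions after rigid alignment."""
    R, t = rigid_align(est, gt)
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

In practice, estimated and ground-truth poses are first associated by timestamp before alignment; tools such as evo automate this pipeline for common trajectory formats.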