🤖 AI Summary
This work addresses the challenge of simultaneous localization and mapping (SLAM) for agricultural robots operating in complex horticultural greenhouse environments, where the lack of representative multimodal real-world datasets has hindered progress. To bridge this gap, we present the first greenhouse dataset spanning full growing seasons of strawberry and raspberry crops, integrating dual 3D LiDARs, a quad-camera RGB rig, an IMU, GNSS, and wheel odometry. High-precision ground-truth trajectories at multiple coverage levels are generated using a total station, AprilTags, and LiDAR-inertial odometry. This benchmark uniquely offers cross-seasonal, multimodal data with accurate ground truth, enabling evaluation of SLAM algorithms, from marker-assisted to marker-free approaches, in realistic settings. The dataset is publicly released along with calibration parameters, reference trajectories, and baseline results that reveal performance limitations of current state-of-the-art methods in greenhouse conditions, providing a critical resource for advancing agricultural robotics perception.
📝 Abstract
Agricultural robotics is gaining increasing relevance in both research and real-world deployment. As these systems are expected to operate autonomously on more complex tasks, the availability of representative real-world datasets becomes essential. While domains such as urban and forestry robotics benefit from large, established benchmarks, horticultural environments remain comparatively under-explored despite the economic significance of the sector. To address this gap, we present HortiMulti, a multimodal, cross-season dataset collected in commercial strawberry and raspberry polytunnels across an entire growing season. It captures substantial appearance variation, dynamic foliage, specular reflections from plastic covers, severe perceptual aliasing, and GNSS-unreliable conditions, all of which directly degrade existing localisation and perception algorithms. The sensor suite includes two 3D LiDARs, four RGB cameras, an IMU, GNSS, and wheel odometry. Ground-truth trajectories are derived from a combination of total station surveying, AprilTag fiducial markers, and LiDAR-inertial odometry, spanning dense, sparse, and marker-free coverage to support evaluation under both controlled and realistic conditions. We release time-synchronised raw measurements, calibration files, reference trajectories, and baseline benchmarks for visual, LiDAR, and multi-sensor SLAM. The results confirm that current state-of-the-art methods remain inadequate for reliable polytunnel deployment, establishing HortiMulti as a one-stop resource for developing and testing robotic perception systems in horticultural environments.