🤖 AI Summary
To address the challenges of cherry tomato yield estimation and unmanned ground vehicle (UGV) deployment limitations in GPS-denied greenhouse environments, this paper proposes a lightweight UAV-based sensing system. The method innovatively integrates LiDAR-inertial odometry with a 3D multi-object tracking algorithm to enable real-time fruit counting and weight estimation under severe occlusion and without GNSS support. RGB-D cameras and 3D LiDAR are synergistically employed for high-precision fruit localization and volumetric-to-weight mapping. Evaluated on a real-world commercial greenhouse harvesting-row dataset, the system achieves 94.4% counting accuracy and 87.5% weight estimation accuracy, completing a 13.2-meter flight in just 10.5 seconds. This work delivers an efficient, minimally intrusive, and practically deployable solution for intelligent, precision monitoring in labor-constrained greenhouse operations.
📝 Abstract
As the agricultural workforce declines and labor costs rise, robotic yield estimation has become increasingly important. While unmanned ground vehicles (UGVs) are commonly used for indoor farm monitoring, their deployment in greenhouses is often constrained by infrastructure limitations, sensor placement challenges, and operational inefficiencies. To address these issues, we develop a lightweight unmanned aerial vehicle (UAV) equipped with an RGB-D camera, a 3D LiDAR, and an IMU sensor. The UAV employs a LiDAR-inertial odometry algorithm for precise navigation in GNSS-denied environments and utilizes a 3D multi-object tracking algorithm to estimate the count and weight of cherry tomatoes. We evaluate the system using two datasets: one from a harvesting row and another from a growing row. On the harvesting-row dataset, the proposed system achieves 94.4% counting accuracy and 87.5% weight estimation accuracy over a 13.2-meter flight completed in 10.5 seconds. For the growing-row dataset, which consists of occluded, unripened fruits, we qualitatively analyze tracking performance and highlight future research directions for improving perception in greenhouses with strong occlusion. Our findings demonstrate the potential of UAVs for efficient robotic yield estimation in commercial greenhouses.
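As an illustrative sketch only (not the paper's actual method), the count-and-weight pipeline the abstract describes can be approximated by greedy nearest-neighbor association of per-frame 3D fruit centroids, with weight derived from a sphere-volume-to-mass mapping. The `GreedyTracker` class, the association gate, and the fruit density value below are all hypothetical choices for illustration:

```python
import math

DENSITY_G_PER_CM3 = 1.0   # assumed cherry-tomato density (hypothetical value)
MATCH_RADIUS_M = 0.03     # association gate in meters (hypothetical value)

def sphere_volume_cm3(radius_m):
    """Volume of a sphere with the given radius (meters), in cm^3."""
    r_cm = radius_m * 100.0
    return (4.0 / 3.0) * math.pi * r_cm ** 3

class GreedyTracker:
    """Greedy nearest-neighbor 3D tracker over per-frame fruit detections."""

    def __init__(self, gate=MATCH_RADIUS_M):
        self.gate = gate
        self.tracks = {}   # track id -> last observed centroid (x, y, z)
        self.radii = {}    # track id -> largest observed radius (meters)
        self.next_id = 0

    def update(self, detections):
        """detections: list of ((x, y, z) centroid in meters, radius in meters)."""
        unmatched = dict(self.tracks)
        for center, radius in detections:
            # Match to the closest existing track within the gate.
            best_id, best_d = None, self.gate
            for tid, prev in unmatched.items():
                d = math.dist(center, prev)
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None:
                best_id = self.next_id   # spawn a new track
                self.next_id += 1
            else:
                unmatched.pop(best_id)   # each track matches at most once
            self.tracks[best_id] = center
            # Keep the largest radius seen, assuming occlusion shrinks estimates.
            self.radii[best_id] = max(self.radii.get(best_id, 0.0), radius)

    def count(self):
        return len(self.tracks)

    def total_weight_g(self):
        return sum(DENSITY_G_PER_CM3 * sphere_volume_cm3(r)
                   for r in self.radii.values())

# Two frames of detections: the first fruit recurs (shifted by UAV motion),
# a second fruit appears only in the later frame.
tracker = GreedyTracker()
tracker.update([((0.00, 0.0, 1.0), 0.012)])
tracker.update([((0.01, 0.0, 1.0), 0.013), ((0.50, 0.0, 1.0), 0.011)])
print(tracker.count())          # 2 unique fruits
print(tracker.total_weight_g())
```

A real system would associate in the odometry-stabilized world frame and handle track termination; this sketch only shows why cross-frame identity is what turns per-frame detections into a yield count.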