🤖 AI Summary
Global apple production faces mounting challenges from declining natural pollinators and from manual pollination's low efficiency, poor robustness, and sensitivity to weather. To address these issues, this work proposes a vision-guided robotic pollination system capable of single-flower-level precise manipulation—reported as the first to achieve closed-loop autonomous control based on real-time floral organ recognition and 3D localization. The system integrates high-resolution multispectral imaging, an enhanced YOLOv8 model for organ detection, point cloud registration for spatial mapping, and coordinated control of a lightweight robotic manipulator, enabling accurate, targeted pollen delivery to individual flowers. Field trials in commercial orchards demonstrate a pollination accuracy of 92.7%, an average operation time of under 8 seconds per flower, and a 31% increase in fruit set compared with manually assisted pollination. This study establishes a novel paradigm for high-precision, autonomous agricultural robotics in complex, unstructured outdoor environments.
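The closed-loop pipeline described above (detect a flower in the image, localize it in 3D, then command the manipulator) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the `Detection` class, camera intrinsics, and reach threshold are all hypothetical stand-ins for the paper's YOLOv8 detector, point-cloud registration, and manipulator controller.

```python
from dataclasses import dataclass
import math

@dataclass
class Detection:
    """Hypothetical per-flower detector output: pixel centre, depth, confidence."""
    u: float       # pixel column of flower centre
    v: float       # pixel row of flower centre
    depth_m: float # depth from the camera, metres
    conf: float    # detector confidence in [0, 1]

# Illustrative pinhole-camera intrinsics (focal lengths and principal point).
FX, FY, CX, CY = 1400.0, 1400.0, 960.0, 540.0

def back_project(det: Detection) -> tuple[float, float, float]:
    """Pinhole back-projection: pixel + depth -> camera-frame XYZ in metres."""
    x = (det.u - CX) * det.depth_m / FX
    y = (det.v - CY) * det.depth_m / FY
    return (x, y, det.depth_m)

def plan_targets(detections, conf_thresh=0.5, reach_m=0.8):
    """Closed-loop sketch: keep confident detections within manipulator reach,
    and return their 3D positions as pollination targets."""
    targets = []
    for det in detections:
        if det.conf < conf_thresh:
            continue  # reject low-confidence detections
        xyz = back_project(det)
        if math.dist(xyz, (0.0, 0.0, 0.0)) <= reach_m:
            targets.append(xyz)  # flower is reachable: schedule pollen delivery
    return targets

if __name__ == "__main__":
    dets = [
        Detection(u=980, v=560, depth_m=0.4, conf=0.9),   # close, confident
        Detection(u=100, v=100, depth_m=2.0, conf=0.95),  # too far to reach
    ]
    print(plan_targets(dets))  # only the first flower becomes a target
```

A real system would replace `back_project` with the paper's point-cloud registration step and feed each target pose to an inverse-kinematics solver, but the filter-localize-act loop has the same shape.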