GDA-YOLO11: Amodal Instance Segmentation for Occlusion-Robust Robotic Fruit Harvesting

📅 2026-02-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of incomplete detection and inaccurate localization of fruits under occlusion, which severely compromises robotic harvesting efficiency. To this end, we propose the GDA-YOLO11 framework, which introduces amodal instance segmentation to agricultural robotic harvesting for the first time. By enhancing the YOLO architecture with an asymmetric mask loss function, our method infers complete fruit masks that include occluded regions. It then estimates optimal grasp points via the Euclidean distance transform and projects them into 3D space, thereby closing the perception-to-action loop. Experimental results demonstrate that the proposed approach achieves a mAP@50 of 0.914, a 1.0% improvement over the baseline. Notably, in moderate-to-high occlusion scenarios, harvesting success rates increase by 3.5%, reaching up to 85.18%, significantly enhancing system robustness and perception-action integration efficiency.
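The exact form of the asymmetric mask loss is not given in this summary. A minimal sketch of one plausible formulation, assuming the asymmetry up-weights pixels in the occluded region (amodal ground truth minus visible ground truth) so the model is penalised more for failing to complete hidden fruit area, could look like the following; `occluded_weight` is a hypothetical hyperparameter, not taken from the paper:

```python
import numpy as np

def asymmetric_mask_loss(pred, amodal_gt, visible_gt,
                         occluded_weight=2.0, eps=1e-7):
    """Pixel-wise binary cross-entropy over the amodal mask, with
    occluded pixels (present in the amodal mask but not visible)
    weighted more heavily. Illustrative sketch only; the paper's
    actual loss may differ.

    pred       : predicted amodal mask probabilities, shape (H, W)
    amodal_gt  : full-fruit ground-truth mask (0/1), shape (H, W)
    visible_gt : visible-region ground-truth mask (0/1), shape (H, W)
    """
    pred = np.clip(pred, eps, 1.0 - eps)
    bce = -(amodal_gt * np.log(pred) + (1 - amodal_gt) * np.log(1 - pred))
    occluded = amodal_gt * (1 - visible_gt)        # hidden fruit pixels
    weights = 1.0 + (occluded_weight - 1.0) * occluded
    return float((weights * bce).mean())
```

With `occluded_weight > 1`, gradients from the hidden region dominate, encouraging the network to hallucinate the full fruit shape rather than settle for the visible fragment.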

📝 Abstract
Occlusion remains a critical challenge in robotic fruit harvesting, as undetected or inaccurately localised fruits often result in substantial crop losses. To mitigate this issue, we propose a harvesting framework using a new amodal segmentation model, GDA-YOLO11, which incorporates architectural improvements and an updated asymmetric mask loss. The proposed model is trained on a modified version of a public citrus dataset and evaluated on both the base dataset and occlusion-sensitive subsets with varying occlusion levels. Within the framework, full fruit masks, including invisible regions, are inferred by GDA-YOLO11, and picking points are subsequently estimated using the Euclidean distance transform. These points are then projected into 3D coordinates for robotic harvesting execution. Experiments were conducted using real citrus fruits in a controlled environment simulating occlusion scenarios. Notably, to the best of our knowledge, this study provides the first practical demonstration of amodal instance segmentation in robotic fruit harvesting. GDA-YOLO11 achieves a precision of 0.844, recall of 0.846, mAP@50 of 0.914, and mAP@50:95 of 0.636, outperforming YOLO11n by 5.1%, 1.3%, and 1.0% in precision, mAP@50, and mAP@50:95, respectively. The framework attains harvesting success rates of 92.59%, 85.18%, 48.14%, and 22.22% at zero to high occlusion levels, improving success by 3.5% under medium and high occlusion. These findings demonstrate that GDA-YOLO11 enhances occlusion-robust segmentation and streamlines perception-to-action integration, paving the way for more reliable autonomous systems in agriculture.
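The perception-to-action step described in the abstract (picking point from the Euclidean distance transform, then back-projection to 3D) can be sketched as follows. This is a minimal illustration assuming a standard pinhole camera model with known intrinsics and a depth reading at the picking pixel; it is not the paper's implementation:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def picking_point_2d(amodal_mask):
    """Choose the pixel deepest inside the predicted full-fruit mask:
    the maximum of the Euclidean distance transform, i.e. the point
    farthest from the mask boundary."""
    dist = distance_transform_edt(amodal_mask)
    v, u = np.unravel_index(np.argmax(dist), dist.shape)
    return int(u), int(v)          # (column, row) pixel coordinates

def project_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into camera-frame
    3D coordinates via the pinhole model; (fx, fy, cx, cy) are
    camera intrinsics from calibration."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth
```

Using the distance-transform maximum rather than the mask centroid keeps the grasp point inside the fruit even for irregular or partially completed masks, and because the mask is amodal, the point can fall in an occluded region that the gripper approach must account for.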
Problem

Research questions and friction points this paper is trying to address.

occlusion
robotic fruit harvesting
amodal instance segmentation
crop loss
fruit detection
Innovation

Methods, ideas, or system contributions that make the work stand out.

amodal instance segmentation
occlusion-robust harvesting
GDA-YOLO11
asymmetric mask loss
perception-to-action integration
🔎 Similar Papers
2023-12-13 · Artificial Intelligence in Agriculture · Citations: 71