Vision-Guided Targeted Grasping and Vibration for Robotic Pollination in Controlled Environments

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
In controlled-environment agriculture, the absence of wind pollination and restrictions on the use of commercial bee colonies pose a critical challenge for crop pollination. To address this, we propose a vision-guided robotic pollination system that integrates visual perception with physical modeling. Our method employs RGB-D sensing, 3D plant reconstruction, and coordinate-frame registration to achieve precise stem localization, and combines collision-free grasp planning with vibration control based on a discrete elastic rod model to enable compliant grasping and controlled pollen release via a soft gripper. The key contribution is the first unified closed-loop vision–action framework that jointly optimizes visual target grasping and physics-informed vibration actuation, overcoming longstanding limitations in contact safety and dynamic responsiveness of pollination robots. Experiments demonstrate a 92.5% success rate in main-stem grasping, and simulation-optimized vibration parameters significantly improve pollination efficiency while preserving floral integrity. Real-world deployment confirms the system’s stability and reliability.

📝 Abstract
Robotic pollination offers a promising alternative to manual labor and bumblebee-assisted methods in controlled agriculture, where wind-driven pollination is absent and regulatory restrictions limit the use of commercial pollinators. In this work, we present and validate a vision-guided robotic framework that uses data from an end-effector mounted RGB-D sensor and combines 3D plant reconstruction, targeted grasp planning, and physics-based vibration modeling to enable precise pollination. First, the plant is reconstructed in 3D and registered to the robot coordinate frame to identify obstacle-free grasp poses along the main stem. Second, a discrete elastic rod model predicts the relationship between actuation parameters and flower dynamics, guiding the selection of optimal pollination strategies. Finally, a manipulator with soft grippers grasps the stem and applies controlled vibrations to induce pollen release. End-to-end experiments demonstrate a 92.5% main-stem grasping success rate, and simulation-guided optimization of vibration parameters further validates the feasibility of our approach, ensuring that the robot can safely and effectively perform pollination without damaging the flower. To our knowledge, this is the first robotic system to jointly integrate vision-based grasping and vibration modeling for automated precision pollination.
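The first pipeline stage registers the reconstructed plant to the robot coordinate frame. The paper does not specify its registration method; as a minimal sketch, a rigid Kabsch/SVD fit between corresponding 3D points (correspondences assumed already known, e.g. from fiducials) recovers the rotation and translation mapping camera-frame points into the robot frame:

```python
import numpy as np

def register_to_robot_frame(plant_pts, robot_pts):
    """Rigid registration (Kabsch/SVD): find R, t such that
    R @ p + t maps camera-frame points p onto robot-frame points."""
    pc, rc = plant_pts.mean(axis=0), robot_pts.mean(axis=0)
    H = (plant_pts - pc).T @ (robot_pts - rc)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = rc - R @ pc
    return R, t

# Hypothetical correspondences: points observed in both frames.
cam = np.array([[0.1, 0.2, 0.9], [0.3, 0.1, 1.1],
                [0.2, 0.4, 1.0], [0.5, 0.3, 0.8]])
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.4, -0.2, 0.1])
rob = cam @ R_true.T + t_true
R, t = register_to_robot_frame(cam, rob)
```

With exact correspondences the fit recovers the true transform; in practice noisy RGB-D data would make this a least-squares estimate.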
Problem

Research questions and friction points this paper is trying to address.

Developing robotic pollination for controlled-environment agriculture
Integrating vision-guided grasping with vibration modeling for precision
Achieving safe and effective pollination without damaging flowers
Innovation

Methods, ideas, or system contributions that make the work stand out.

Vision-guided robotic framework with RGB-D sensor
3D plant reconstruction and targeted grasp planning
Soft grippers with controlled vibration for pollination
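The paper's discrete elastic rod model predicts how actuation parameters map to flower dynamics. As a much simpler illustrative stand-in (not the paper's model, and with made-up parameters f0_hz, zeta, drive_mm), a single-mode damped oscillator shows how sweeping drive frequency exposes the resonance a planner could target for efficient pollen release:

```python
import numpy as np

# Simplified stand-in for the discrete elastic rod model: treat the
# flower-bearing stem as one damped oscillator and compute the
# steady-state tip amplitude under harmonic drive at each frequency.
# All parameter values are illustrative, not from the paper.

def tip_amplitude(freq_hz, f0_hz=30.0, zeta=0.05, drive_mm=1.0):
    """Steady-state amplitude of a damped oscillator driven at freq_hz."""
    r = np.asarray(freq_hz) / f0_hz                      # frequency ratio
    gain = 1.0 / np.sqrt((1 - r**2)**2 + (2 * zeta * r)**2)
    return drive_mm * gain                               # mm of tip motion

freqs = np.linspace(5.0, 60.0, 111)                      # 0.5 Hz steps
amps = tip_amplitude(freqs)
best = freqs[np.argmax(amps)]                            # near 30 Hz resonance
```

A real planner would additionally cap the amplitude to preserve floral integrity, which is where the physics-based model earns its keep over naive resonance-seeking.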
👥 Authors
Tuan-Anh Vu
Department of Mechanical & Aerospace Engineering, University of California, Los Angeles (UCLA), CA 90095, USA
Radha Lahoti
Department of Mechanical & Aerospace Engineering, University of California, Los Angeles (UCLA), CA 90095, USA
Jiawen Wang
Department of Mechanical & Aerospace Engineering, University of California, Los Angeles (UCLA), CA 90095, USA
Vivek Alumootil
Department of Computer Science, University of California, Los Angeles (UCLA), CA 90095, USA
M. Jawed
Department of Mechanical & Aerospace Engineering, University of California, Los Angeles (UCLA), CA 90095, USA