🤖 AI Summary
This work proposes a dual-arm surgical assistant robot that addresses the limitations of manual instrument handling, which is prone to fatigue, and of existing robotic systems, which rely on predefined trajectories and lack dynamic adaptability and safety guarantees. For the first time, the system integrates a zero-shot vision-language model with real-time minimum-distance obstacle perception to autonomously generate grasping and delivery trajectories in response to surgeon commands, without requiring preprogrammed paths. Dynamic obstacle avoidance and self-collision prevention are unified within a single quadratic programming framework to ensure safe, smooth motion. Experimental results demonstrate an 83.33% success rate in instrument handover under dynamic conditions, with collision-free and stable operation throughout all trials.
📝 Abstract
During surgery, scrub nurses are required to frequently deliver surgical instruments to surgeons, which can lead to physical fatigue and decreased focus. Robotic scrub nurses offer a promising solution that can take over these repetitive tasks and enhance efficiency. Existing research on robotic scrub nurses relies on predefined paths for instrument delivery, which limits generalizability and poses safety risks in dynamic environments. To address these challenges, we present a collision-free dual-arm surgical assistive robot capable of performing instrument delivery. A vision-language model is utilized to automatically generate the robot's grasping and delivery trajectories in a zero-shot manner based on surgeons' instructions. A real-time minimum-distance obstacle perception method is proposed and integrated into a unified quadratic programming framework. This framework ensures reactive obstacle avoidance and self-collision prevention during the dual-arm robot's autonomous movement in dynamic environments. Extensive experimental validation demonstrates that the proposed robotic system achieves an 83.33% success rate in surgical instrument delivery while maintaining smooth, collision-free movement throughout all trials. The project page and source code are available at https://give-me-scissors.github.io/.
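To give a rough intuition for the unified quadratic-programming approach described above, the sketch below shows a generic velocity-level QP that tracks a desired end-effector velocity while enforcing a minimum-distance constraint to an obstacle. This is not the paper's implementation: the function name, the single linearized distance constraint, the gains, and the use of `scipy.optimize.minimize` (SLSQP) as the solver are all illustrative assumptions; the actual system handles dual-arm self-collision and multiple obstacles in one QP.

```python
import numpy as np
from scipy.optimize import minimize


def qp_velocity_step(J, v_des, n_hat, d, d_min=0.05, gain=2.0, qdot_max=1.0):
    """Hypothetical one-step velocity QP for reactive obstacle avoidance.

    Minimizes ||J @ qdot - v_des||^2 subject to a linearized distance
    constraint that keeps the obstacle distance d above d_min:
        n_hat^T (J @ qdot) >= -gain * (d - d_min)
    where n_hat is the unit vector from the obstacle toward the robot link
    and d is the current minimum distance (as a perception module would supply).
    """
    nq = J.shape[1]

    def objective(qdot):
        e = J @ qdot - v_des
        return 0.5 * e @ e  # quadratic tracking cost

    constraints = [{
        "type": "ineq",  # SLSQP convention: fun(x) >= 0 is feasible
        "fun": lambda qdot: n_hat @ (J @ qdot) + gain * (d - d_min),
    }]
    bounds = [(-qdot_max, qdot_max)] * nq  # joint-velocity limits

    res = minimize(objective, np.zeros(nq), method="SLSQP",
                   constraints=constraints, bounds=bounds)
    return res.x


# Toy example: identity Jacobian, desired motion straight toward an obstacle
# that is only 1 cm beyond the safety margin; the QP clips the approach speed.
J = np.eye(2)
v_des = np.array([1.0, 0.0])
n_hat = np.array([-1.0, 0.0])   # obstacle lies in the +x direction
qdot = qp_velocity_step(J, v_des, n_hat, d=0.06)
```

With these numbers the constraint reduces to `qdot[0] <= 0.02`, so the solver returns approximately `[0.02, 0.0]`: motion toward the obstacle is throttled in proportion to the remaining clearance, while orthogonal motion is unaffected. A production system would typically use a dedicated QP solver and stack one such constraint per obstacle/link pair, plus analogous constraints between the two arms for self-collision prevention.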