Bring Your Own Grasp Generator: Leveraging Robot Grasp Generation for Prosthetic Grasping

📅 2025-03-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Upper-limb prosthesis users face significant challenges in controlling multi-degree-of-freedom (multi-DOF) devices due to high cognitive and interaction burdens. Method: This work proposes an “eye-in-hand” shared-autonomy prosthetic hand grasping system that operates solely on monocular RGB video—enabling real-time 3D object geometry reconstruction, hand motion tracking, and user-intent-driven grasp selection without depth sensors. It pioneers the integration of robotic grasp generation paradigms into prosthetic control, establishing an end-to-end, intent-driven shared-autonomy architecture that automatically configures the DOFs of the Hannes prosthetic hand. Results: In evaluations with both able-bodied participants and upper-limb amputees, the system significantly improves grasping speed and drastically reduces interaction steps compared to conventional multi-DOF control baselines, demonstrating strong clinical feasibility and practical utility.

📝 Abstract
One of the most important research challenges in upper-limb prosthetics is enhancing user-prosthesis communication so that it closely resembles the experience of a natural limb. As prosthetic devices become more complex, users often struggle to control the additional degrees of freedom. In this context, leveraging shared-autonomy principles can significantly improve the usability of these systems. In this paper, we present a novel eye-in-hand prosthetic grasping system that follows these principles. Our system initiates the approach-to-grasp action based on the user's command and automatically configures the DoFs of a prosthetic hand. First, it reconstructs the 3D geometry of the target object without the need for a depth camera. Then, it tracks the hand motion during the approach-to-grasp action and finally selects a candidate grasp configuration according to the user's intentions. We deploy our system on the Hannes prosthetic hand and test it on able-bodied subjects and amputees to validate its effectiveness. We compare it with a multi-DoF prosthetic control baseline and find that our method enables faster grasps while simplifying the user experience. Code and demo videos are available online at https://hsp-iit.github.io/byogg/.
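At a high level, the shared-autonomy loop described in the abstract proceeds in stages: reconstruct the object's 3D geometry from monocular RGB, generate grasp candidates with a robotic grasp generator, select one according to the user's approach motion, and send the resulting DoF configuration to the prosthesis. The sketch below illustrates that staged structure only; every function name, data type, and value is an illustrative placeholder, not the paper's actual implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GraspCandidate:
    dof_config: List[float]  # hypothetical finger/wrist DoF values for the hand
    score: float             # quality score assigned by the grasp generator

def reconstruct_geometry(rgb_frames: List[str]) -> dict:
    """Stand-in for monocular 3D reconstruction (no depth camera)."""
    # The real system would build object geometry from RGB video; here we
    # return a dummy summary so the sketch stays runnable.
    return {"num_points": 1000 * len(rgb_frames)}

def generate_grasps(geometry: dict) -> List[GraspCandidate]:
    """Stand-in for a robotic grasp generator applied to the reconstruction."""
    return [
        GraspCandidate(dof_config=[0.2, 0.5, 0.1], score=0.8),
        GraspCandidate(dof_config=[0.6, 0.3, 0.4], score=0.9),
    ]

def select_grasp(candidates: List[GraspCandidate]) -> GraspCandidate:
    """Stand-in for intent-driven selection.

    The paper infers user intent from hand tracking during the
    approach-to-grasp motion; this placeholder just takes the
    highest-scoring candidate.
    """
    return max(candidates, key=lambda c: c.score)

def shared_autonomy_grasp(rgb_frames: List[str]) -> List[float]:
    """Run the staged pipeline and return a DoF configuration."""
    geometry = reconstruct_geometry(rgb_frames)
    candidates = generate_grasps(geometry)
    chosen = select_grasp(candidates)
    return chosen.dof_config  # would be sent to the prosthesis controller
```

The point of the sketch is the division of labor: the autonomy handles geometry and grasp synthesis, while the user contributes only the triggering command and the approach motion.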
Problem

Research questions and friction points this paper is trying to address.

Enhance user-prosthesis communication to approach the experience of a natural limb
Simplify control of multi-DoF prosthetic devices through shared autonomy
Improve grasping speed and user experience in prosthetic hands
Innovation

Methods, ideas, or system contributions that make the work stand out.

Eye-in-hand system for prosthetic grasping
3D object reconstruction without a depth camera
Automated grasp configuration based on user intent