🤖 AI Summary
To resolve the trade-off between anthropomorphic functionality and actuator minimization in tendon-driven multi-fingered robotic hands, this paper presents a 3D-printed anthropomorphic five-finger soft robotic hand driven by only two actuators. Joint-selective motion is achieved via antagonistic dual-tendon actuation, while a novel flexible tactile fingertip sensor, fully 3D-printed as a single assembly-free part embedded in the finger, enables closed-loop tactile feedback. The design integrates algorithms for contact-adaptive grasping, millisecond-level slip detection, and real-time grasp-pose adjustment, supporting both human-hand motion mapping and autonomous adaptive grasping. Experiments demonstrate robust adaptive grasping across diverse object geometries, with automatic pose stabilization upon contact and synchronized optimization of grip force and pose during slip events. Four comparative experiments confirm high gesture-control accuracy, strong environmental adaptability, and stable human–robot collaboration. The open-source design ensures low cost and high reproducibility.
📝 Abstract
For tendon-driven multi-fingered robotic hands, ensuring grasp adaptability while minimizing the number of actuators needed to provide human-like functionality is a challenging problem. Inspired by the Pisa/IIT SoftHand, this paper introduces a 3D-printed, highly underactuated, five-finger robotic hand named the Tactile SoftHand-A, which features only two actuators. Its dual-tendon design allows active control of specific (distal or proximal interphalangeal) joints to adjust the hand's grasp gesture. We have also developed a new fully 3D-printed tactile sensor that requires no manual assembly and is printed directly as part of the robotic finger. This sensor is integrated into the fingertips and combined with the antagonistic tendon mechanism to create a human-hand-guided tactile feedback grasping system. The system can actively mirror human hand gestures, adaptively stabilize its grasp gesture upon contact, and adjust that gesture to prevent object movement once slippage is detected. Finally, we designed four experiments to evaluate the novel fingers coupled with the antagonistic mechanism, assessing gesture control, adaptive grasping ability, and human-hand-guided tactile feedback grasping. The experimental results demonstrate that the Tactile SoftHand-A can adaptively grasp objects of a wide range of shapes and automatically adjust its grip gesture upon detecting contact and slippage. Overall, this study points the way towards a class of low-cost, accessible, 3D-printable, underactuated, human-like robotic hands. The designs are open-sourced at github.com/SoutheastWind/Tactile_SoftHand_A to enable others to build upon this work.
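The tactile feedback behavior described above (stabilize the grasp on contact, tighten and adjust on slip) amounts to a simple event-driven control loop. The following is a minimal illustrative sketch of such a loop, not the authors' implementation: the state names, thresholds, and the scalar tendon-tension update are all assumptions made for clarity.

```python
# Hypothetical sketch of a contact/slip-driven grasp adjustment loop.
# Thresholds and the tension increment are illustrative assumptions,
# not values from the Tactile SoftHand-A.

CONTACT_THRESHOLD = 0.1   # normal-force reading treated as contact (assumed units)
SLIP_THRESHOLD = 0.05     # shear-signal change treated as slippage (assumed units)

def control_step(state, contact_signal, slip_signal, tendon_tension):
    """One iteration of the feedback loop; returns (new_state, new_tension)."""
    if state == "closing" and contact_signal > CONTACT_THRESHOLD:
        # Contact detected: stop closing and hold the current gesture.
        return "holding", tendon_tension
    if state == "holding" and slip_signal > SLIP_THRESHOLD:
        # Slip detected: increase grip force; the real system would also
        # adjust the grasp pose via the antagonistic dual tendons.
        return "holding", tendon_tension + 0.1
    # No event: keep the current state and tension.
    return state, tendon_tension
```

Running this at a high loop rate against fast tactile readings is what would make millisecond-level slip response possible; the sketch only shows the decision logic, not the sensor interface or motor drive.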