🤖 AI Summary
This work addresses the challenge of safely and dexterously manipulating fragile, deformable objects with a multi-fingered, underactuated tactile hand under external disturbances. We propose a closed-loop adaptive grasping method grounded in shear-force feedback. Methodologically, we introduce a parallel tactile feedback control framework that combines vision-based tactile sensing at all five fingertips (miniature 'microTac' sensors), asynchronous tactile image acquisition, and supervised deep learning with transfer learning to obtain consistent contact pose and force models across sensors; this is coupled with control of the underactuated hand driven by simultaneous force feedback from every fingertip. Our key contribution is human-inspired reflexive tactile manipulation: experiments demonstrate substantially improved robustness and anthropomorphic dexterity, including holding a flexible cup without crushing it as its load changes, actively compensating for dynamic center-of-mass shifts to prevent tipping, and seamless tactile-guided human-robot leader-follower collaboration.
📝 Abstract
This paper presents a shear-based control scheme for grasping and manipulating delicate objects with a Pisa/IIT anthropomorphic SoftHand equipped with soft biomimetic tactile sensors on all five fingertips. These 'microTac' tactile sensors are miniature versions of the TacTip vision-based tactile sensor, and can extract precise contact geometry and force information at each fingertip for use as feedback into a controller to modulate the grasp while a held object is manipulated. Using a parallel processing pipeline, we asynchronously capture tactile images and predict contact pose and force from multiple tactile sensors. Consistent pose and force models across all sensors are developed using supervised deep learning with transfer learning techniques. We then develop a grasp control framework that uses contact force feedback from all fingertip sensors simultaneously, allowing the hand to safely handle delicate objects even under external disturbances. This control framework is applied to several grasp-manipulation experiments: first, retaining a flexible cup in a grasp without crushing it under changes in object weight; second, a pouring task where the center of mass of the cup changes dynamically; and third, a tactile-driven leader-follower task where a human guides a held object. These manipulation tasks demonstrate more human-like dexterity with underactuated robotic hands by using fast reflexive control from tactile sensing.
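The pipeline the abstract describes (asynchronous per-sensor image capture and force prediction feeding a single grasp controller) can be sketched as a minimal toy program. Everything here is illustrative, not the paper's code: `capture_and_predict` stands in for one fingertip's camera-plus-CNN loop (a random walk replaces real inference), `grasp_command` is a simple proportional rule for tightening or relaxing the grasp from the summed shear magnitude, and the five-finger layout mirrors the SoftHand's five microTac sensors.

```python
# Illustrative sketch only: names, gains, and the random-walk "sensor" are
# assumptions, not the authors' implementation.
import queue
import random
import threading
import time

NUM_FINGERS = 5  # five fingertip sensors, as on the SoftHand

def capture_and_predict(finger_id, out_queue, stop_event):
    """Stand-in for one sensor's asynchronous loop: grab a tactile image,
    run the learned pose/force model, emit a shear-force estimate."""
    force = 0.0
    while not stop_event.is_set():
        force += random.uniform(-0.05, 0.05)  # placeholder for CNN inference
        out_queue.put((finger_id, force))
        time.sleep(0.01)

def grasp_command(target_shear, forces, gain=0.5):
    """Proportional grasp modulation: tighten when the summed shear magnitude
    rises above the setpoint (load increasing / incipient slip), relax otherwise."""
    total = sum(abs(f) for f in forces.values())
    return gain * (total - target_shear)

if __name__ == "__main__":
    out_q = queue.Queue()
    stop = threading.Event()
    workers = [
        threading.Thread(target=capture_and_predict, args=(i, out_q, stop), daemon=True)
        for i in range(NUM_FINGERS)
    ]
    for w in workers:
        w.start()

    # Control loop: consume estimates from whichever finger produced one,
    # and issue a grasp command once every finger has reported.
    forces = {}
    t_end = time.time() + 0.2
    while time.time() < t_end:
        finger_id, f = out_q.get()
        forces[finger_id] = f
        if len(forces) == NUM_FINGERS:
            delta = grasp_command(target_shear=1.0, forces=forces)
            # a real system would send `delta` to the hand's tendon motor here
    stop.set()
```

The single shared queue is one simple way to realize the "parallel processing pipeline": each sensor thread runs at its own rate, while the controller always acts on the most recent estimate from every fingertip.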