Shear-based Grasp Control for Multi-fingered Underactuated Tactile Robotic Hands

📅 2025-03-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of safely and dexterously manipulating fragile, deformable objects with multi-fingered underactuated tactile hands under external disturbances. The authors propose a closed-loop adaptive grasping method grounded in shear-force feedback. Methodologically, they introduce, for the first time, a parallel tactile feedback control framework for inter-finger shear-force control, integrating microvision-based tactile sensing (microTac), asynchronous image acquisition, supervised deep learning, and transfer learning to achieve consistent contact models across sensors; this is coupled with modeling of the underactuated hand and multi-point synchronous force feedback control. The key contribution is the realization of human-inspired reflexive tactile manipulation: experiments demonstrate significantly improved operational robustness and anthropomorphic dexterity, evidenced by holding a flexible cup without deformation as its load changes, actively compensating for dynamic center-of-mass shifts to prevent tipping, and seamless human-robot collaborative tactile guidance.

📝 Abstract
This paper presents a shear-based control scheme for grasping and manipulating delicate objects with a Pisa/IIT anthropomorphic SoftHand equipped with soft biomimetic tactile sensors on all five fingertips. These 'microTac' tactile sensors are miniature versions of the TacTip vision-based tactile sensor, and can extract precise contact geometry and force information at each fingertip for use as feedback into a controller to modulate the grasp while a held object is manipulated. Using a parallel processing pipeline, we asynchronously capture tactile images and predict contact pose and force from multiple tactile sensors. Consistent pose and force models across all sensors are developed using supervised deep learning with transfer learning techniques. We then develop a grasp control framework that uses contact force feedback from all fingertip sensors simultaneously, allowing the hand to safely handle delicate objects even under external disturbances. This control framework is applied to several grasp-manipulation experiments: first, retaining a flexible cup in a grasp without crushing it under changes in object weight; second, a pouring task where the center of mass of the cup changes dynamically; and third, a tactile-driven leader-follower task where a human guides a held object. These manipulation tasks demonstrate more human-like dexterity with underactuated robotic hands by using fast reflexive control from tactile sensing.
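The parallel pipeline described in the abstract, where each fingertip sensor is captured and processed asynchronously, might be organized roughly as below. This is a minimal sketch, not the paper's implementation: the worker function, queue layout, frame counts, and the stand-in pose/force predictions are all illustrative placeholders for the camera grab and per-sensor deep-learning model.

```python
import threading
import queue
import time

NUM_FINGERS = 5  # one microTac per SoftHand fingertip

def capture_and_predict(finger_id, out_q, n_frames=3):
    """Hypothetical per-sensor worker: grab a tactile image and
    predict (pose, shear force); stands in for a camera + CNN."""
    for frame in range(n_frames):
        image = f"frame{frame}@finger{finger_id}"  # stand-in for an image grab
        pose, force = (0.0, 0.0), 0.1 * finger_id  # stand-in for model output
        out_q.put((finger_id, pose, force))
        time.sleep(0.001)  # each sensor runs at its own rate

def run_pipeline():
    """Run all fingertip workers in parallel, then collect predictions."""
    out_q = queue.Queue()  # thread-safe channel to the controller
    workers = [
        threading.Thread(target=capture_and_predict, args=(i, out_q))
        for i in range(NUM_FINGERS)
    ]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    # controller side: drain predictions gathered from all fingertips
    readings = []
    while not out_q.empty():
        readings.append(out_q.get())
    return readings
```

In a real-time controller the workers would run continuously and the control loop would read only the latest prediction per finger; joining the threads here just keeps the sketch self-contained.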
Problem

Research questions and friction points this paper is trying to address.

Control grasp for delicate objects using tactile feedback
Develop force and pose models via deep learning
Enable human-like dexterity in underactuated robotic hands
Innovation

Methods, ideas, or system contributions that make the work stand out.

Shear-based control for delicate object grasping
Deep learning models for tactile sensor feedback
Parallel processing for real-time tactile data
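The shear-based control idea above can be sketched as a simple proportional update of the hand's single synergy motor command from the fingertip shear readings: if measured shear drops below a target (the object is slipping or unloading), the grasp tightens; if it rises above, the grasp relaxes. The gain, target force, units, and mean-over-fingers aggregation are assumptions for illustration, not the paper's controller.

```python
def grasp_controller_step(q, shear_forces, f_target=0.3, k_p=0.5,
                          q_min=0.0, q_max=1.0):
    """One proportional update of the synergy motor command q
    (hypothetical gains/units) from fingertip shear forces."""
    f_mean = sum(shear_forces) / len(shear_forces)
    error = f_target - f_mean      # positive => grip too loose
    q_new = q + k_p * error        # tighten or relax the single actuator
    return min(max(q_new, q_min), q_max)  # respect motor limits
```

Running such an update at the tactile frame rate, with predictions arriving from all five sensors in parallel, is what gives the fast reflex-like response described in the experiments.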