MimicTouch: Leveraging Multi-modal Human Tactile Demonstrations for Contact-rich Manipulation

📅 2023-10-25
📈 Citations: 1
Influential: 0
🤖 AI Summary
Tactile perception is critical for contact-intensive tasks such as peg insertion, yet existing robotic approaches rely predominantly on visual feedback—creating a modality mismatch between sensing (vision) and task requirements (touch). To address this, we propose MimicTouch, the first framework that directly acquires multimodal tactile data from bare-handed human demonstrations and learns tactile-guided manipulation policies. Its core contributions are: (1) a custom wearable tactile sensing system for the human hand; (2) a cross-embodiment imitation learning framework integrating behavioral cloning with online residual reinforcement learning to bridge morphological and dynamical disparities between human hands and robotic grippers; and (3) cross-modal transfer enabling tactile-action policy generalization. Evaluated on diverse contact-rich manipulation tasks, MimicTouch significantly outperforms vision-based and purely simulation-trained baselines, demonstrating effective transfer of human tactile strategies to robotic fine manipulation.
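To make the policy structure concrete, below is a minimal sketch of a frozen behavioral-cloning base policy combined with an online residual RL correction. The class name, network shapes, and the additive `base + residual` action composition are illustrative assumptions, not the paper's actual implementation:

```python
import torch
import torch.nn as nn

class ResidualPolicy(nn.Module):
    """Frozen BC base policy plus a learned residual correction (sketch)."""

    def __init__(self, base_policy: nn.Module, obs_dim: int, act_dim: int):
        super().__init__()
        self.base_policy = base_policy
        for p in self.base_policy.parameters():
            p.requires_grad = False  # keep the demonstration-trained policy fixed
        # Hypothetical residual head; the paper's actual architecture may differ.
        self.residual = nn.Sequential(
            nn.Linear(obs_dim, 128), nn.ReLU(),
            nn.Linear(128, act_dim),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            base_action = self.base_policy(obs)  # nominal action from imitation
        # Online RL trains only the residual head, correcting for the
        # human-hand-to-robot-gripper embodiment gap.
        return base_action + self.residual(obs)
```

An off-the-shelf RL algorithm would then optimize only the residual head's parameters online, so exploration stays close to the behavior demonstrated by the human.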
📝 Abstract
Tactile sensing is critical to fine-grained, contact-rich manipulation tasks, such as insertion and assembly. Prior research has shown the possibility of learning a tactile-guided policy from teleoperated demonstration data. However, to provide the demonstration, human users often rely on visual feedback to control the robot. This creates a gap between the sensing modality used for controlling the robot (visual) and the modality of interest (tactile). To bridge this gap, we introduce "MimicTouch", a novel framework for learning policies directly from demonstrations provided by human users with their hands. The key innovations are i) a human tactile data collection system which collects a multi-modal tactile dataset for learning humans' tactile-guided control strategies, ii) an imitation learning-based framework for learning such strategies from this data, and iii) an online residual RL framework to bridge the embodiment gap between the human hand and the robot gripper. Through comprehensive experiments, we highlight the efficacy of utilizing humans' tactile-guided control strategies to resolve contact-rich manipulation tasks. The project website is at https://sites.google.com/view/MimicTouch.
Problem

Research questions and friction points this paper is trying to address.

How to bridge the gap between the visual modality used to teleoperate robots and the tactile modality the task actually requires.
How to learn a tactile-guided policy directly from bare-handed human demonstrations rather than teleoperated data.
How to transfer human tactile control strategies across the embodiment gap to solve contact-rich manipulation tasks.
Innovation

Methods, ideas, or system contributions that make the work stand out.

A wearable multi-modal tactile data collection system for bare-handed human demonstrations
An imitation learning framework that distills the human's tactile-guided control strategy from this data (see the sketch after this list)
An online residual RL framework that bridges the embodiment gap between the human hand and the robot gripper
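A minimal sketch of the imitation-learning piece referenced above: one behavioral-cloning update that regresses demonstrated actions from fused multi-modal observations. The encoder architecture, the tactile/audio modality split, and the batch layout are assumptions for illustration, not details taken from the paper:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiModalBCPolicy(nn.Module):
    """Fuses two demonstration modalities and regresses the human's action."""

    def __init__(self, tactile_dim: int = 64, audio_dim: int = 32, act_dim: int = 7):
        super().__init__()
        # Hypothetical fusion encoder; modality dimensions are placeholders.
        self.encoder = nn.Sequential(
            nn.Linear(tactile_dim + audio_dim, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
        )
        self.head = nn.Linear(128, act_dim)

    def forward(self, tactile: torch.Tensor, audio: torch.Tensor) -> torch.Tensor:
        z = self.encoder(torch.cat([tactile, audio], dim=-1))
        return self.head(z)

def bc_step(policy: MultiModalBCPolicy,
            optimizer: torch.optim.Optimizer,
            batch: dict) -> float:
    """One behavioral-cloning update: match the demonstrated action."""
    pred = policy(batch["tactile"], batch["audio"])
    loss = F.mse_loss(pred, batch["action"])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing a policy trained this way and learning only a residual on the robot (as in the earlier sketch) lets the online RL stage correct for the human-to-gripper embodiment gap without unlearning the demonstrated strategy.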