Haptic-ACT: Bridging Human Intuition with Compliant Robotic Manipulation via Immersive VR

📅 2024-09-18
🏛️ arXiv.org
📈 Citations: 2
Influential: 0
🤖 AI Summary
The high cost of robot teleoperation demonstrations and the difficulty of modeling compliant manipulation motivate this work. The authors propose Haptic-ACT (Haptic Action Chunking with Transformers), a haptic-enhanced VR teleoperation framework for imitation learning. The method introduces an immersive VR teleoperation system with real-time force-feedback acquisition and an architecture that combines tactile signals with a Transformer backbone to model contact dynamics and temporal action dependencies. The framework unifies MuJoCo simulation, haptic rendering, and deployment on physical robots. Experiments show that Haptic-ACT improves compliant manipulation over the baseline ACT in both simulation and real-robot settings: demonstrator fingertip contact forces decrease by 32.7%, and 50 demonstration episodes of a fine-grained pick-and-place task were collected to train and evaluate the system. These results support the role of tactile guidance in acquiring high-fidelity demonstration data for robust imitation learning.
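Haptic-ACT builds on Action Chunking with Transformers (ACT), in which the policy predicts a chunk of future actions at every timestep and overlapping predictions for the same timestep are fused by exponentially weighted temporal ensembling. A minimal NumPy sketch of that ensembling step, for illustration only (the function name and interface here are hypothetical; the weighting w_i = exp(-m·i) follows the original ACT formulation, not code from this paper):

```python
import numpy as np

def temporal_ensemble(chunks, chunk_len, horizon, m=0.01):
    """Fuse overlapping action chunks, ACT-style.

    chunks[t] is the action chunk predicted at timestep t,
    an array of shape (chunk_len, dof). Each executed timestep
    averages all chunk entries that cover it, weighting older
    predictions by exp(-m * age).
    """
    dof = chunks[0].shape[1]
    actions = np.zeros((horizon, dof))
    for t in range(horizon):
        preds, weights = [], []
        # Chunks predicted at timesteps start..t may cover timestep t.
        for start in range(max(0, t - chunk_len + 1), t + 1):
            if start < len(chunks):
                preds.append(chunks[start][t - start])
                weights.append(np.exp(-m * (t - start)))
        w = np.array(weights) / np.sum(weights)
        actions[t] = np.sum(np.array(preds) * w[:, None], axis=0)
    return actions
```

Smaller `m` weights recent and older predictions more evenly, smoothing the executed trajectory; larger `m` favors the freshest chunk.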

📝 Abstract
Robotic manipulation is essential for the widespread adoption of robots in industrial and home settings and has long been a focus within the robotics community. Advances in artificial intelligence have introduced promising learning-based methods to address this challenge, with imitation learning emerging as particularly effective. However, efficiently acquiring high-quality demonstrations remains a challenge. In this work, we introduce an immersive VR-based teleoperation setup designed to collect demonstrations from a remote human user. We also propose an imitation learning framework called Haptic Action Chunking with Transformers (Haptic-ACT). To evaluate the platform, we conducted a pick-and-place task and collected 50 demonstration episodes. Results indicate that the immersive VR platform significantly reduces demonstrator fingertip forces compared to systems without haptic feedback, enabling more delicate manipulation. Additionally, evaluations of the Haptic-ACT framework in both the MuJoCo simulator and on a real robot demonstrate its effectiveness in teaching robots more compliant manipulation compared to the original ACT. Additional materials are available at https://sites.google.com/view/hapticact.
Problem

Research questions and friction points this paper is trying to address.

Efficiently acquiring high-quality robotic manipulation demonstrations.
Reducing demonstrator fingertip forces using immersive VR with haptic feedback.
Teaching robots compliant manipulation via the Haptic-ACT framework.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Immersive VR teleoperation for demonstration collection
Haptic-ACT framework for compliant robotic manipulation
Reduced fingertip forces with haptic feedback
Kelin Li
Robot Intelligence Lab, Imperial College London, and Extend Robotics
Shubham M. Wagh
Extend Robotics
Nitish Sharma
Extend Robotics
Saksham Bhadani
Extend Robotics
Wei Chen
Robot Intelligence Lab, Imperial College London
Chang Liu
Extend Robotics
Petar Kormushev
Imperial College London, Director of Robot Intelligence Lab
Robot Intelligence · Robot Learning · Reinforcement Learning · Machine Learning · Robotics