FreeTacMan: Robot-free Visuo-Tactile Data Collection System for Contact-rich Manipulation

📅 2025-06-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current visuo-tactile manipulation data collection suffers from low efficiency, limited sensor modalities, rigid mechanical designs, and absence of natural haptic feedback. To address these limitations, we propose a human-centered, robot-free wearable paradigm for visuo-tactile data acquisition. Our approach features a soft, wearable bimodal gripper enabling high-fidelity tactile sensing and intuitive manual manipulation. It integrates high-precision optical motion capture with a multi-sensor temporal synchronization mechanism to ensure strict alignment between tactile and visual signals. Furthermore, we introduce an end-to-end visuo-tactile policy learning framework. The system significantly improves both data collection efficiency and quality, enabling training of high-performance policies on multiple contact-intensive manipulation tasks. All code and hardware designs are publicly released to enhance reproducibility and scalability in visuo-tactile manipulation research.
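The summary mentions a multi-sensor temporal synchronization mechanism that keeps tactile and visual signals strictly aligned. As a rough illustration (not the paper's actual mechanism, which may rely on hardware triggering or the motion-capture clock), a software-side approach is to pair each camera frame with the nearest-in-time tactile sample and discard pairs whose timestamp gap exceeds a tolerance. The function and parameter names below are hypothetical:

```python
# Hypothetical sketch of nearest-timestamp alignment between a visual
# stream and a higher-rate tactile stream; not the paper's implementation.
from bisect import bisect_left

def align_streams(visual_ts, tactile_ts, tol=0.005):
    """Pair each visual frame with the nearest tactile sample.

    visual_ts, tactile_ts: sorted timestamps in seconds.
    tol: maximum allowed gap (s); looser pairs are dropped.
    Returns a list of (visual_index, tactile_index) pairs.
    """
    pairs = []
    for i, t in enumerate(visual_ts):
        j = bisect_left(tactile_ts, t)
        # Candidates: the tactile samples just before and just after t.
        cands = [k for k in (j - 1, j) if 0 <= k < len(tactile_ts)]
        k = min(cands, key=lambda k: abs(tactile_ts[k] - t))
        if abs(tactile_ts[k] - t) <= tol:
            pairs.append((i, k))
    return pairs

# Example: a 30 Hz camera against a 100 Hz tactile sensor.
visual = [n / 30 for n in range(5)]
tactile = [n / 100 for n in range(20)]
print(align_streams(visual, tactile))
# → [(0, 0), (1, 3), (2, 7), (3, 10), (4, 13)]
```

In practice the tolerance would be chosen well below half the slower sensor's frame period so that each visual frame maps to at most one unambiguous tactile sample.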

📝 Abstract
Enabling robots to perform contact-rich manipulation remains a pivotal challenge in robot learning, substantially hindered by a data collection gap: inefficient pipelines and limited sensor setups. While prior work has explored handheld paradigms, their rod-based mechanical structures remain rigid and unintuitive, provide limited tactile feedback, and pose challenges for human operators. Motivated by the dexterity and force feedback of human motion, we propose FreeTacMan, a human-centric, robot-free data collection system for accurate and efficient robot manipulation. Concretely, we design a wearable data collection device with dual visuo-tactile grippers that can be worn on human fingers for intuitive and natural control. A high-precision optical tracking system captures end-effector poses while synchronizing visual and tactile feedback. FreeTacMan improves data collection performance over prior systems and enables effective policy learning for contact-rich manipulation tasks by exploiting visuo-tactile information. We will release the work to facilitate reproducibility and accelerate research in visuo-tactile manipulation.
Problem

Research questions and friction points this paper is trying to address.

Addresses inefficiency in robot manipulation data collection
Improves tactile feedback with wearable visuo-tactile grippers
Enables effective learning for contact-rich manipulation tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wearable dual visuo-tactile grippers for fingers
High-precision optical tracking for poses
Synchronized visual and tactile feedback system
Longyan Wu
Fudan University
Checheng Yu
Nanjing University
Jieji Ren
Shanghai Jiao Tong University
Li Chen
The University of Hong Kong
Ran Huang
Fudan University
Guoying Gu
Shanghai Jiao Tong University
Hongyang Li
The University of Hong Kong, Shanghai Innovation Institute