Tac-Man: Tactile-Informed Prior-Free Manipulation of Articulated Objects

📅 2024-03-04
🏛️ IEEE Transactions on Robotics
📈 Citations: 15
Influential: 0
🤖 AI Summary
Robust manipulation of articulated objects (e.g., doors, drawers) in human environments is hindered by the lack of kinematic priors, especially when object structure is unknown. Method: We propose a fully model-free, touch-driven manipulation paradigm grounded in high-resolution tactile sensing. Our approach establishes a closed-loop control framework integrating real-time contact-state estimation, adaptive force regulation, and model-free policy optimization—requiring no CAD models, joint parameters, or visual reconstruction. Contribution/Results: The core innovation replaces conventional kinematic modeling with high-fidelity contact modeling, enabling dynamic, stable interaction using tactile feedback alone. Evaluated on both physical robots and large-scale simulation, our method achieves >98% success rate—significantly outperforming state-of-the-art model-based and vision-based approaches. It demonstrates exceptional generalization and robustness to unknown objects, parametric uncertainties, and external disturbances.
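The closed-loop idea described above — estimate the contact state from tactile feedback, then alternate between exploratory motion and contact-restoring correction — can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names, marker representation, threshold, and gain are all hypothetical, and a real system would estimate contact state from a high-resolution tactile sensor rather than raw 2-D marker arrays.

```python
import numpy as np

def contact_deviation(markers_now, markers_ref):
    """Mean displacement of tactile marker positions from the reference
    contact state (a simplified stand-in for contact-state estimation)."""
    return float(np.mean(np.linalg.norm(markers_now - markers_ref, axis=1)))

def tac_man_step(pose, markers_now, markers_ref, explore_dir,
                 threshold=0.5, gain=0.1):
    """One iteration of a prior-free explore/recover loop (illustrative).

    While the tactile deviation stays small, take a small exploratory step
    along the current manipulation direction; once the deviation exceeds
    the threshold, move instead to restore the initial contact state.
    """
    dev = contact_deviation(markers_now, markers_ref)
    if dev < threshold:
        # Contact is stable: continue exploring the object's motion.
        return pose + gain * explore_dir
    # Contact is slipping: correct toward the reference contact state.
    correction = np.mean(markers_ref - markers_now, axis=0)
    return pose + gain * correction
```

The key property this sketch tries to convey is that no joint model appears anywhere: the admissible motion of the articulated object is discovered online from which displacements keep the tactile deviation small.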

📝 Abstract
Integrating robots into human-centric environments, such as homes, necessitates advanced manipulation skills, as robotic devices will need to engage with articulated objects such as doors and drawers. Key challenges in robotic manipulation of articulated objects are the unpredictability and diversity of these objects' internal structures, which render models based on object kinematic priors, both explicit and implicit, inadequate. Their reliability is significantly diminished by pre-interaction ambiguities, imperfect structural parameters, encounters with unknown objects, and unforeseen disturbances. Here, we present a prior-free strategy, Tac-Man, focusing on maintaining stable robot-object contact during manipulation. Without relying on object priors, Tac-Man leverages tactile feedback to enable robots to proficiently handle a variety of articulated objects, including those with complex joints, even when influenced by unexpected disturbances. Demonstrated in both real-world experiments and extensive simulations, it consistently achieves near-perfect success in dynamic and varied settings, outperforming existing methods. Our results indicate that tactile sensing alone suffices for managing diverse articulated objects, offering greater robustness and generalization than prior-based approaches. This underscores the importance of detailed contact modeling in complex manipulation tasks, especially with articulated objects. Advancements in tactile-informed approaches significantly expand the scope of robotic applications in human-centric environments, particularly where accurate models are difficult to obtain.
Problem

Research questions and friction points this paper is trying to address.

Managing articulated objects without prior kinematic models
Handling unpredictability and diversity in internal object structures
Overcoming pre-interaction ambiguities and unexpected disturbances
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tactile feedback enables prior-free manipulation
Maintains stable contact during object interaction
Handles diverse articulated objects without models
Zihang Zhao
PhD Candidate, Peking University
Manipulation, Tactile Robotics
Yuyang Li
Institute for AI, Peking University
Robotic Manipulation, Tactile Sensing, Human-Object Interaction
Wanlin Li
Beijing Institute for General Artificial Intelligence
Zhenghao Qi
Institute for Artificial Intelligence, Peking University; Department of Automation, Tsinghua University
Lecheng Ruan
HIT / UCLA / PKU
Robotics, Control, Knowledge Representation
Yixin Zhu
Assistant Professor, Peking University
Computer Vision, Visual Reasoning, Human-Robot Teaming
K. Althoefer
School of Engineering and Materials Science, Queen Mary University of London