WHED: A Wearable Hand Exoskeleton for Natural, High-Quality Demonstration Collection

📅 2026-02-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of acquiring high-fidelity human demonstration data for dexterous multi-finger manipulation, which is hindered by occlusion, complex hand kinematics, and dense contact interactions. To overcome this, the authors propose a wearability-prioritized hand exoskeleton system featuring a pose-tolerant thumb coupling mechanism, linkage-driven finger interfaces, and passive adaptive structures integrated with multimodal sensing—including joint encoders, AR-based end-effector pose estimation, and synchronized wrist-mounted vision. This design preserves natural hand motion while enabling precise kinematic mapping to a robotic hand. An end-to-end synchronous capture-and-replay pipeline is implemented, successfully collecting representative manipulation sequences such as precision pinch and whole-hand enveloping grasps, and demonstrating qualitative consistency between human demonstrations and robotic replay.

📝 Abstract
Scalable learning of dexterous manipulation remains bottlenecked by the difficulty of collecting natural, high-fidelity human demonstrations of multi-finger hands due to occlusion, complex hand kinematics, and contact-rich interactions. We present WHED, a wearable hand-exoskeleton system designed for in-the-wild demonstration capture, guided by two principles: wearability-first operation for extended use and a pose-tolerant, free-to-move thumb coupling that preserves natural thumb behaviors while maintaining a consistent mapping to the target robot thumb degrees of freedom. WHED integrates a linkage-driven finger interface with passive fit accommodation, a modified passive hand with robust proprioceptive sensing, and an onboard sensing/power module. We also provide an end-to-end data pipeline that synchronizes joint encoders, AR-based end-effector pose, and wrist-mounted visual observations, and supports post-processing for time alignment and replay. We demonstrate feasibility on representative grasping and manipulation sequences spanning precision pinch and full-hand enclosure grasps, and show qualitative consistency between collected demonstrations and replayed executions.
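The abstract describes a pipeline that synchronizes joint encoders, AR-based end-effector pose, and wrist-mounted camera frames, with post-processing for time alignment. The paper does not specify the alignment method; as a hedged illustration, one common post-hoc approach is to resample each sensor stream onto a reference clock (e.g. camera frame timestamps) by nearest-neighbor timestamp matching. The function and signal names below are hypothetical, not from the paper:

```python
import numpy as np

def align_streams(ref_ts, src_ts, src_vals):
    """Resample a sensor stream (src_ts, src_vals) onto reference
    timestamps ref_ts by nearest-neighbor timestamp matching --
    one simple post-hoc time-alignment strategy."""
    ref_ts = np.asarray(ref_ts, dtype=float)
    src_ts = np.asarray(src_ts, dtype=float)
    # Index of the first source timestamp >= each reference timestamp
    idx = np.searchsorted(src_ts, ref_ts)
    idx = np.clip(idx, 1, len(src_ts) - 1)
    # Pick whichever neighbor (left or right) is closer in time
    left_closer = (ref_ts - src_ts[idx - 1]) < (src_ts[idx] - ref_ts)
    idx = np.where(left_closer, idx - 1, idx)
    return np.asarray(src_vals)[idx]

# Example: map 50 Hz joint-encoder readings onto 30 Hz camera frames
cam_ts = np.arange(0.0, 1.0, 1 / 30)   # camera frame timestamps (s)
enc_ts = np.arange(0.0, 1.0, 1 / 50)   # encoder timestamps (s)
enc_vals = np.sin(enc_ts)              # stand-in encoder joint angles
aligned = align_streams(cam_ts, enc_ts, enc_vals)
```

Each camera frame then carries the encoder reading closest to it in time (here within half an encoder period, 10 ms); interpolation would be a natural refinement for slower streams.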
Problem

Research questions and friction points this paper is trying to address.

dexterous manipulation
human demonstration
hand exoskeleton
high-fidelity data collection
multi-finger hands
Innovation

Methods, ideas, or system contributions that make the work stand out.

wearable hand exoskeleton
dexterous manipulation
thumb coupling
high-fidelity demonstration
data pipeline
Mingzhang Zhu
Department of Mechanical and Aerospace Engineering, UCLA, Los Angeles, CA, USA.
Alvin Zhu
University of California Los Angeles
robotics, deep learning, reinforcement learning
Jose Victor S. H. Ramos
Department of Mechanical and Aerospace Engineering, UCLA, Los Angeles, CA, USA.
Beom Jun Kim
Professor of Physics, Sungkyunkwan University, Korea
statistical physics, complex networks
Yike Shi
Department of Computer Science, UCLA, Los Angeles, CA, USA.
Yufeng Wu
Department of Mechanical and Aerospace Engineering, UCLA, Los Angeles, CA, USA.
Ruochen Hou
Department of Mechanical and Aerospace Engineering, UCLA, Los Angeles, CA, USA.
Quanyou Wang
PhD Student at UCLA
Robotics
Eric Song
Department of Computer Science, UCLA, Los Angeles, CA, USA.
Tony Fan
Department of Electrical Engineering, UCLA, Los Angeles, CA, USA.
Yuchen Cui
Department of Computer Science, UCLA, Los Angeles, CA, USA.
Dennis W. Hong
Department of Mechanical and Aerospace Engineering, UCLA, Los Angeles, CA, USA.