UniFucGrasp: Human-Hand-Inspired Unified Functional Grasp Annotation Strategy and Dataset for Diverse Dexterous Hands

📅 2025-08-05
🤖 AI Summary
Existing dexterous grasping datasets emphasize stability but lack functional annotations for tasks such as bottle-cap opening or cup-handle grasping, and predominantly rely on expensive high-DOF hands (e.g., Shadow Hand), limiting generalization across diverse robotic hands.
Method: We propose a human-inspired, unified functional grasping annotation framework that integrates biomechanically grounded motion mapping with geometric force-closure analysis. Using low-DOF human motion capture, we drive multi-hand-type mapping and synthesize the first cross-hand-type, multifunctional grasping dataset (UFG).
Contribution/Results: Our approach is cost-effective, enables efficient annotation, and exhibits strong generalizability. Evaluated in IsaacSim simulation and real-robot experiments, it significantly improves functional manipulation accuracy (+23.6%) and grasping stability. UFG establishes a scalable, functionality-oriented foundation for embodied intelligence, enabling robust dexterous manipulation across heterogeneous hand platforms.

📝 Abstract
Dexterous grasp datasets are vital for embodied intelligence, but mostly emphasize grasp stability, ignoring functional grasps needed for tasks like opening bottle caps or holding cup handles. Most rely on bulky, costly, and hard-to-control high-DOF Shadow Hands. Inspired by the human hand's underactuated mechanism, we establish UniFucGrasp, a universal functional grasp annotation strategy and dataset for multiple dexterous hand types. Based on biomimicry, it maps natural human motions to diverse hand structures and uses geometry-based force closure to ensure functional, stable, human-like grasps. This method supports low-cost, efficient collection of diverse, high-quality functional grasps. Finally, we establish the first multi-hand functional grasp dataset and provide a synthesis model to validate its effectiveness. Experiments on the UFG dataset, IsaacSim, and complex robotic tasks show that our method improves functional manipulation accuracy and grasp stability, enables efficient generalization across diverse robotic hands, and overcomes annotation cost and generalization challenges in dexterous grasping. The project page is at https://haochen611.github.io/UFG.
Problem

Research questions and friction points this paper is trying to address.

Existing grasp datasets ignore the functional grasps required by everyday tasks
High-DOF Shadow Hands are costly and hard to control
Lack of unified functional grasp annotation for diverse hands
Innovation

Methods, ideas, or system contributions that make the work stand out.

Human-hand-inspired underactuated grasp annotation strategy
Geometry-based force closure for functional stable grasps
Multi-hand functional grasp dataset and synthesis model
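The geometry-based force-closure idea behind the second innovation can be illustrated in its simplest setting: for a two-finger planar grasp with Coulomb friction, Nguyen's condition states that the grasp is force closure iff the line segment joining the two contacts lies inside both friction cones. The sketch below is illustrative only (not the paper's implementation); the contact points, normals, and friction coefficient are made-up examples.

```python
import math

def antipodal_force_closure(p1, n1, p2, n2, mu):
    """2-D two-finger force-closure test (Nguyen's condition):
    the grasp is force closure iff the segment joining the contacts
    lies inside both friction cones.
    p1, p2: contact points; n1, n2: inward contact normals;
    mu: Coulomb friction coefficient."""
    half_angle = math.atan(mu)  # friction-cone half angle

    def angle_between(v, n):
        # angle between vector v and normal n, clamped for acos safety
        dot = v[0] * n[0] + v[1] * n[1]
        norm = math.hypot(v[0], v[1]) * math.hypot(n[0], n[1])
        return math.acos(max(-1.0, min(1.0, dot / norm)))

    line12 = (p2[0] - p1[0], p2[1] - p1[1])  # contact 1 -> contact 2
    line21 = (-line12[0], -line12[1])        # contact 2 -> contact 1
    return (angle_between(line12, n1) <= half_angle + 1e-9 and
            angle_between(line21, n2) <= half_angle + 1e-9)

# Antipodal pinch on opposite sides of an object, normals pointing
# inward along the connecting line: force closure for any mu > 0.
print(antipodal_force_closure((-1, 0), (1, 0), (1, 0), (-1, 0), 0.5))  # True
```

For full multi-finger hands the analogous test checks that the contact wrenches positively span the wrench space, but the two-finger case above already captures why geometry alone can certify a stable functional grasp.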
Haoran Lin
School of Artificial Intelligence and Robotics, Hunan University, China; National Engineering Research Center of Robot Visual Perception and Control Technology, Hunan University, China
Wenrui Chen
Hunan University
Xianchi Chen
School of Artificial Intelligence and Robotics, Hunan University, China; National Engineering Research Center of Robot Visual Perception and Control Technology, Hunan University, China
Fan Yang
School of Artificial Intelligence and Robotics, Hunan University, China; National Engineering Research Center of Robot Visual Perception and Control Technology, Hunan University, China
Qiang Diao
School of Artificial Intelligence and Robotics, Hunan University, China
Wenxin Xie
College of Mechanical and Vehicle Engineering, Hunan University, China
Sijie Wu
College of Mechanical and Vehicle Engineering, Hunan University, China
Kailun Yang
Professor, School of Artificial Intelligence and Robotics, Hunan University (HNU); KIT; UAH; ZJU
Maojun Li
College of Mechanical and Vehicle Engineering, Hunan University, China
Yaonan Wang
School of Artificial Intelligence and Robotics, Hunan University, China; National Engineering Research Center of Robot Visual Perception and Control Technology, Hunan University, China