🤖 AI Summary
This work addresses the challenge of transferring dexterous manipulation skills from humans to soft robotic hands, which exhibit extreme morphological differences and nonlinear compliance. To overcome this, the authors propose a force-aware, two-stage retargeting framework. Human demonstrations—captured in immersive virtual reality with synchronized contact forces and geometric information—are leveraged to explicitly model contact force distributions and geometry, enabling the transfer of functional intent rather than mere motion trajectories to a non-anthropomorphic pneumatic soft hand. Real-time control is achieved through end-effector pose tracking combined with geodesic-weighted contact optimization. Experiments demonstrate that, compared to baseline methods, the proposed approach reduces fingertip trajectory tracking RMSE by up to 55% and variance by 69%, while significantly improving task success rates in both simulation and zero-shot real-world deployment.
📝 Abstract
We introduce SoftAct, a framework for teaching soft robot hands to perform human-like manipulation skills by explicitly reasoning about contact forces. Leveraging immersive virtual reality, our system captures rich human demonstrations, including hand kinematics, object motion, dense contact patches, and detailed contact force information. Unlike conventional approaches that retarget human joint trajectories, SoftAct employs a two-stage, force-aware retargeting algorithm. The first stage attributes demonstrated contact forces to individual human fingers and allocates robot fingers proportionally, establishing a force-balanced mapping between human and robot hands. The second stage performs online retargeting by combining baseline end-effector pose tracking with geodesic-weighted contact refinements, using contact geometry and force magnitude to adjust robot fingertip targets in real time. This formulation enables soft robotic hands to reproduce the functional intent of human demonstrations while naturally accommodating extreme embodiment mismatch and nonlinear compliance. We evaluate SoftAct on a suite of contact-rich manipulation tasks using a custom non-anthropomorphic pneumatic soft robot hand. SoftAct's controller reduces fingertip trajectory tracking RMSE by up to 55% and tracking variance by up to 69% compared to kinematic and learning-based baselines. At the policy level, SoftAct achieves consistently higher success rates in zero-shot real-world deployment and in simulation. These results demonstrate that explicitly modeling contact geometry and force distribution is essential for effective skill transfer to soft robotic hands, and that this information cannot be recovered through kinematic imitation alone. Project videos and additional details are available at https://soft-act.github.io/.
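To make the two-stage idea concrete, here is a minimal Python sketch of the kind of logic the abstract describes: stage 1 assigns robot fingers to human fingers in proportion to demonstrated force shares, and stage 2 blends a baseline fingertip target toward the demonstrated contact point with a weight that decays with geodesic distance and grows with contact force. All function names, the Gaussian weighting, and the linear blending rule are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of force-aware, two-stage retargeting.
# The allocation rule and the geodesic/force weighting below are
# assumptions for illustration only.
import math

def allocate_fingers(human_forces, n_robot_fingers):
    """Stage 1: assign robot fingers in proportion to each human
    finger's share of the total demonstrated contact force."""
    total = sum(human_forces)
    raw = [f / total * n_robot_fingers for f in human_forces]
    alloc = [int(r) for r in raw]
    # Largest-remainder rounding so the allocation sums exactly
    # to the number of robot fingers.
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i],
                   reverse=True)
    for i in order[: n_robot_fingers - sum(alloc)]:
        alloc[i] += 1
    return alloc

def refine_target(baseline_tip, contact_point, geodesic_dist, force_mag,
                  sigma=0.05, gain=0.1):
    """Stage 2: blend the baseline fingertip target toward the
    demonstrated contact point; the weight falls off with geodesic
    distance on the object surface and saturates with force."""
    w = math.exp(-geodesic_dist**2 / (2 * sigma**2)) * min(force_mag * gain, 1.0)
    return tuple(b + w * (c - b) for b, c in zip(baseline_tip, contact_point))

# Example: 5 human fingers mapped onto a 4-fingered robot hand.
forces = [4.0, 3.0, 1.0, 1.0, 1.0]  # N, per human finger
print(allocate_fingers(forces, 4))
print(refine_target((0.0, 0.0, 0.1), (0.0, 0.0, 0.0),
                    geodesic_dist=0.02, force_mag=5.0))
```

The allocation step mirrors the "force-balanced mapping" in the abstract: fingers that carried more force in the demonstration receive proportionally more robot fingers, which matters precisely because the robot hand is non-anthropomorphic and a one-to-one finger mapping does not exist.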