🤖 AI Summary
This work addresses the sim-to-real control challenge for soft robots with deformable freeform surfaces. We propose a correspondence-free functional transfer learning framework that jointly models the deformation function space and confidence maps via deep neural networks, enabling robust shape mapping directly from raw 3D point clouds or sparse, incomplete marker data. The framework integrates seamlessly with neural inverse kinematics and closed-loop shape control. Our key contribution is the first application of functional modeling to soft surface control, eliminating reliance on explicit point correspondences and significantly improving generalization across devices and sensing modalities (e.g., point clouds vs. markers). We validate the method on four pneumatically actuated soft robot platforms: a deformable membrane, a robotic mannequin, and two soft manipulators. Results demonstrate high-precision, real-time shape control with strong robustness to sensor noise, occlusion, and domain shift.
📝 Abstract
This paper presents a correspondence-free, function-based sim-to-real learning method for controlling deformable freeform surfaces. Unlike traditional sim-to-real transfer methods that rely heavily on marker points with full correspondences, our approach simultaneously learns a deformation function space and a confidence map -- both parameterized by a neural network -- to map simulated shapes to their real-world counterparts. As a result, sim-to-real learning can be conducted with input from either a 3D scanner as point clouds (without correspondences) or a motion capture system as marker points (tolerating missed markers). The resultant sim-to-real transfer can be seamlessly integrated into a neural network-based computational pipeline for inverse kinematics and shape control. We demonstrate the versatility and adaptability of our method across vision devices and on four pneumatically actuated soft robots: a deformable membrane, a robotic mannequin, and two soft manipulators.
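To make the correspondence-free idea concrete, the sketch below shows one common way such a mapping can be trained without point correspondences: a one-directional Chamfer-style distance from predicted surface points to a raw observed point cloud, down-weighted by a per-point confidence map. This is a minimal illustration of the general technique, not the paper's actual loss; the function name, arguments, and the specific weighting scheme are assumptions.

```python
import numpy as np

def confidence_weighted_chamfer(pred_pts, obs_pts, conf):
    """Hypothetical correspondence-free fitting loss (illustrative only).

    pred_pts: (N, 3) predicted real-world positions of simulated surface points
    obs_pts:  (M, 3) raw scan or marker positions; no correspondences required,
              and M need not equal N (tolerates missed markers)
    conf:     (N,) confidence in [0, 1] down-weighting occluded/noisy points
    """
    # Pairwise squared distances between predictions and observations: (N, M)
    d2 = np.sum((pred_pts[:, None, :] - obs_pts[None, :, :]) ** 2, axis=-1)
    # Each predicted point is matched to its nearest observed point,
    # so no explicit correspondence labels are needed.
    nearest = d2.min(axis=1)
    # Normalize confidences so the loss scale is independent of N.
    w = conf / (conf.sum() + 1e-8)
    return float(np.sum(w * nearest))
```

In a full pipeline, a loss of this kind would be minimized over the parameters of the neural deformation function, with the confidence map learned jointly so that unreliable regions (occlusions, dropped markers) contribute less to the fit.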