🤖 AI Summary
Low Sim2Real fidelity and inconsistent experimental setups hinder reliable evaluation in fabric manipulation. To address this, we propose DRAPER, which combines: (1) real-world adaptations of the simulation environment that emulate realistic grasping errors such as misgrasping and multi-layer grasping; (2) a lightweight visual preprocessing method that bridges the perception gap between simulation and reality; and (3) a tweezer-extended gripper with a robust grasping procedure. Integrated with deep-learning-based cloth-shaping controllers, DRAPER is validated across fabrics of diverse material, size, and colour and across multiple robotic platforms. Experiments demonstrate significant improvements in real-world grasping success rate and task-completion consistency. Crucially, DRAPER enables reproducible, fair, cross-method and cross-platform benchmarking, establishing a standardised evaluation framework for fabric-manipulation research.
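To make the error-emulation idea concrete, here is a minimal sketch of how grasping errors might be injected at the point where a particle-based cloth simulator attaches the gripper to the cloth. Everything in it (the cloth representation, the `emulate_grasp` function, and the error probabilities) is a hypothetical illustration, not DRAPER's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def emulate_grasp(cloth_particles, target_xy, miss_prob=0.1,
                  multilayer_prob=0.2, pos_noise_std=0.01):
    """Illustrative grasp-error model (hypothetical, not DRAPER's code).

    cloth_particles: (N, 3) array of particle positions.
    target_xy:       intended pick point in the cloth plane.
    Returns the indices of the particles the gripper actually holds.
    """
    # Misgrasp: with some probability the gripper closes on nothing.
    if rng.random() < miss_prob:
        return np.array([], dtype=int)

    # Perturb the commanded pick point to mimic detection/actuation noise.
    noisy_xy = np.asarray(target_xy) + rng.normal(0.0, pos_noise_std, size=2)

    # Grasp the particle nearest to the (noisy) pick point.
    dists = np.linalg.norm(cloth_particles[:, :2] - noisy_xy, axis=1)
    grasped = [int(np.argmin(dists))]

    # Multi-layer grasp: occasionally pinch additional particles from
    # layers stacked nearby (here: anything within a small radius).
    if rng.random() < multilayer_prob:
        extra = np.where(dists < 0.02)[0]
        grasped = sorted(set(grasped) | set(extra.tolist()))

    return np.array(grasped, dtype=int)

# Toy usage: a 10x10 particle grid standing in for a simulated cloth.
grid = np.stack(np.meshgrid(np.linspace(0, 0.3, 10),
                            np.linspace(0, 0.3, 10)), axis=-1).reshape(-1, 2)
cloth = np.concatenate([grid, np.zeros((100, 1))], axis=1)
print(emulate_grasp(cloth, target_xy=(0.15, 0.15)))
```

Trajectories collected under such perturbed grasps expose the learned controller to the same failure modes it will meet on real hardware, which is the intent behind DRAPER's simulation adaptations.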
📝 Abstract
Comparing robotic cloth-manipulation systems in a real-world setup is challenging. The fidelity gap between simulation-trained neural cloth controllers and real-world operation hinders the reliable deployment of these methods in physical trials, and inconsistent experimental setups and hardware limitations across approaches obstruct objective evaluation. This study presents a reliable real-world comparison of simulation-trained neural controllers on both flattening and folding tasks, using fabrics that vary in material, size, and colour. To enable this comprehensive study, we introduce the DRAPER framework, which reliably reflects the true capabilities of these controllers. DRAPER specifically addresses real-world grasping errors, such as misgrasping and multilayer grasping, by adapting the simulation environment to provide data trajectories that closely reflect real grasping scenarios. It also employs a dedicated set of vision-processing techniques to close the simulation-to-reality gap in perception, and it achieves robust grasping through a tweezer-extended gripper and an accompanying grasping procedure. We demonstrate DRAPER's generalisability across deep-learning methods and robotic platforms, offering valuable insights to the cloth-manipulation research community.
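As a rough illustration of what perception-side preprocessing can look like, the sketch below segments the fabric from a camera frame and re-renders it as a flat-coloured mask, closer to what a simulated camera produces. It is a minimal sketch under assumed conventions (OpenCV, HSV background thresholding, a green workspace); `sim_style_observation` and its parameters are hypothetical, and the paper's actual techniques may differ.

```python
import cv2
import numpy as np

def sim_style_observation(bgr_image, bg_lower=(35, 40, 40),
                          bg_upper=(85, 255, 255)):
    """Hypothetical sim-to-real preprocessing sketch (not DRAPER's code).

    Segments the cloth by removing a coloured workspace background,
    then renders it as a uniform-colour blob on black, which is closer
    to what a simulator's camera typically produces.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)

    # Everything matching the (assumed green) background is discarded;
    # the remainder is treated as cloth.
    bg_mask = cv2.inRange(hsv, np.array(bg_lower), np.array(bg_upper))
    cloth_mask = cv2.bitwise_not(bg_mask)

    # Morphological cleanup suppresses sensor noise and small holes.
    kernel = np.ones((5, 5), np.uint8)
    cloth_mask = cv2.morphologyEx(cloth_mask, cv2.MORPH_OPEN, kernel)
    cloth_mask = cv2.morphologyEx(cloth_mask, cv2.MORPH_CLOSE, kernel)

    # Flat rendering: constant cloth colour on a black background,
    # removing real-world texture, shading, and lighting variation.
    obs = np.zeros_like(bgr_image)
    obs[cloth_mask > 0] = (200, 200, 200)
    return obs

# Toy usage on a synthetic frame: a grey "cloth" square on a green table.
frame = np.full((240, 320, 3), (40, 160, 40), np.uint8)
frame[80:160, 120:220] = (180, 180, 180)
obs = sim_style_observation(frame)
```

The design intuition is that a controller trained on clean simulated renders should receive equally clean inputs at deployment; normalising away texture and lighting is one simple way to approximate that.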