DRAPER: Towards a Robust Robot Deployment and Reliable Evaluation for Quasi-Static Pick-and-Place Cloth-Shaping Neural Controllers

📅 2024-09-23
🤖 AI Summary
Low Sim2Real fidelity and inconsistent experimental setups hinder reliable evaluation in fabric manipulation. To address this, we propose DRAPER: (1) a novel simulation-adaptive modeling mechanism explicitly designed to emulate realistic grasping errors—such as misdetection and multi-layer grasps; (2) a lightweight visual preprocessing method to bridge the perception-domain gap between simulation and reality; and (3) a tweezer-style extended gripper with a robust grasping pipeline. Integrated with a deep learning–based cloth-shaping controller, DRAPER is validated across diverse fabric properties (material, size, color) and multiple robotic platforms. Experiments demonstrate significant improvements in real-world grasping success rate and task completion consistency. Crucially, DRAPER enables reproducible, fair, cross-method and cross-platform benchmarking—thereby establishing a standardized evaluation framework for fabric manipulation research.

📝 Abstract
Comparing robotic cloth-manipulation systems in a real-world setup is challenging. The fidelity gap between simulation-trained cloth neural controllers and real-world operation hinders the reliable deployment of these methods in physical trials. Inconsistent experimental setups and hardware limitations among different approaches obstruct objective evaluations. This study demonstrates a reliable real-world comparison of different simulation-trained neural controllers on both flattening and folding tasks with different types of fabrics varying in material, size, and colour. We introduce the DRAPER framework to enable this comprehensive study, which reliably reflects the true capabilities of these neural controllers. It specifically addresses real-world grasping errors, such as misgrasping and multilayer grasping, through real-world adaptations of the simulation environment to provide data trajectories that closely reflect real-world grasping scenarios. It also employs a special set of vision processing techniques to close the simulation-to-reality gap in the perception. Furthermore, it achieves robust grasping by adopting a tweezer-extended gripper and a grasping procedure. We demonstrate DRAPER's generalisability across different deep-learning methods and robotic platforms, offering valuable insights to the cloth manipulation research community.
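As a rough illustration of what "real-world adaptations of the simulation environment" could mean in practice, the toy sketch below injects misgrasp and multilayer-grasp events into simulated pick actions so that training trajectories reflect realistic grasping failures. The probabilities, function names, and interface here are hypothetical, not taken from the paper.

```python
import random

# Hypothetical error rates; DRAPER's actual adaptation mechanism is not
# a pair of fixed constants like these.
P_MISGRASP = 0.15    # gripper closes on nothing
P_MULTILAYER = 0.10  # gripper pinches more than one cloth layer

def perturb_pick(pick_xy, cloth_layers_at):
    """Emulate real-world grasp errors for a simulated pick action.

    pick_xy: intended pick point (x, y) on the cloth
    cloth_layers_at: callable returning the number of cloth layers
                     under a point in the simulated cloth mesh
    Returns (outcome, layers_grasped).
    """
    if random.random() < P_MISGRASP:
        return "misgrasp", 0                # nothing attaches to the gripper
    layers = cloth_layers_at(pick_xy)
    if layers > 1 and random.random() < P_MULTILAYER:
        return "multilayer", layers         # all layers move together
    return "clean", 1
```

A controller trained on trajectories perturbed this way sees the same failure modes it will meet on hardware, which is the gap the paper's simulation adaptation targets.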
Problem

Research questions and friction points this paper is trying to address.

Bridging the simulation-to-reality gap in cloth manipulation
Ensuring reliable deployment of neural controllers in the real world
Standardizing the evaluation of robotic cloth-manipulation systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

The DRAPER framework bridges the simulation-to-reality gap
Adapts the simulation to reflect real-world grasping scenarios
Combines vision-processing techniques with a tweezer-extended gripper
Halid Abdulrahim Kadi
School of Computer Science, University of St Andrews
Jose Alex Chandy
School of Computer Science, University of Nottingham
Luis Figueredo
University of Nottingham
Kasim Terzić
School of Computer Science, University of St Andrews
P. Caleb-Solly
School of Computer Science, University of Nottingham