SSFold: Learning to Fold Arbitrary Crumpled Cloth Using Graph Dynamics From Human Demonstration

📅 2024-10-24
🏛️ IEEE Transactions on Automation Science and Engineering
📈 Citations: 3
Influential: 0
🤖 AI Summary
Robotic cloth folding faces core challenges including high-dimensional deformation, self-occlusion, and the sim-to-real gap; existing methods are typically task-specific and generalize poorly. This paper proposes the first end-to-end dual-stream deep policy network: a sequential (temporal) stream plans pick-and-place poses over time, while a spatial stream reconstructs the full cloth configuration from partial point clouds by constructing a visibility graph. Crucially, it is the first to incorporate real human hand demonstrations, enabling zero-shot transfer across cloth attributes (e.g., color, shape, stiffness). The method integrates graph neural networks, point cloud processing, and hand pose tracking. Evaluated on a UR5 platform across six folding tasks, it achieves up to a 100% success rate, significantly outperforming state-of-the-art approaches, and unifies smoothing and folding in a single policy, thereby overcoming key practical bottlenecks in deformable object manipulation.

📝 Abstract
Robotic cloth manipulation poses significant challenges due to the fabric's complex dynamics and the high dimensionality of configuration spaces. Previous approaches have focused on isolated smoothing or folding tasks and relied heavily on simulations, often struggling to bridge the sim-to-real gap. This gap arises because simulated cloth dynamics fail to capture real-world properties such as elasticity, friction, and occlusions, causing accuracy loss and limited generalization. To tackle these challenges, we propose a two-stream architecture with sequential and spatial pathways, unifying smoothing and folding tasks into a single adaptable policy model. The sequential stream determines pick-and-place positions, while the spatial stream, using a connectivity dynamics model, constructs a visibility graph from partial point cloud data, enabling the model to infer the cloth's full configuration despite occlusions. To address the sim-to-real gap, we integrate real-world human demonstration data via a hand-tracking detection algorithm, enhancing real-world performance across diverse cloth configurations. Our method, validated on a UR5 robot across six distinct cloth folding tasks, consistently achieves desired folded states from arbitrary crumpled initial configurations, with success rates of 100.0%, 100.0%, 83.3%, 66.7%, 83.3%, and 66.7%. It outperforms state-of-the-art cloth manipulation techniques and generalizes to unseen fabrics with diverse colors, shapes, and stiffness. Project page: https://zcswdt.github.io/SSFold/

Note to Practitioners: In this paper, we introduce SSFold, a novel framework for robotic cloth manipulation that integrates human demonstrations with advanced learning techniques, providing a practical solution for real-world applications. Practitioners in industries such as textile manufacturing, automated laundry services, and medical fabric handling can leverage this method to improve operational efficiency and significantly reduce reliance on manual labor. By using a two-stream architecture to handle complex cloth dynamics and self-occlusions, SSFold unifies smoothing and folding tasks into a single policy model that adapts effectively to diverse fabric types and conditions. A key advantage of the method is its use of low-cost, easy-to-set-up hand-tracking systems for collecting human demonstration data, reducing the need for expensive, complex setups. The framework has been validated in real-world scenarios on a UR5 robot, achieving high success rates across a variety of folding tasks, and it demonstrates strong generalization, making it suitable for applications beyond the tasks it was initially trained on. This scalability and flexibility offer a practical, cost-effective path to integrating robotic cloth manipulation into various industries.
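The spatial stream's connectivity dynamics model builds a visibility graph from partial point cloud data. As a rough illustration of the underlying idea, the sketch below connects observed cloth points that lie within a distance threshold of each other; the `build_connectivity_graph` helper and its radius value are illustrative assumptions, and the paper's actual construction additionally reasons about occluded (invisible) regions, which this sketch does not attempt:

```python
import numpy as np

def build_connectivity_graph(points, radius):
    """Connect points closer than `radius`; returns an (E, 2) edge array.

    Simplified stand-in for a visibility/connectivity graph over a
    partial cloth point cloud (illustrative only, not SSFold's exact
    construction).
    """
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    i, j = np.where((dist < radius) & (dist > 0.0))
    mask = i < j                      # keep each undirected edge once
    return np.stack([i[mask], j[mask]], axis=1)

# Toy "partial point cloud": a flat 5x5 grid of cloth points, 2 cm apart.
xs, ys = np.meshgrid(np.arange(5) * 0.02, np.arange(5) * 0.02)
cloud = np.stack([xs.ravel(), ys.ravel(), np.zeros(25)], axis=1)

# Radius of 2.5 cm links horizontal/vertical neighbors but not diagonals.
edges = build_connectivity_graph(cloud, radius=0.025)
```

In a learned pipeline the resulting edge list would feed a graph neural network, which propagates features along these connections to predict the cloth's full configuration.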
Problem

Research questions and friction points this paper is trying to address.

Overcoming complex fabric dynamics in robotic cloth manipulation
Bridging the sim-to-real gap in deformable object manipulation
Unifying smoothing and folding into a single adaptable policy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Two-stream architecture for cloth manipulation
Human demonstration data integration
Visibility graph from partial point clouds
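The sequential stream outputs pick-and-place positions. A common way such outputs are decoded in cloth manipulation policies, shown here as a generic pattern rather than SSFold's exact network head, is to predict one score map over the image for the grasp point and one for the release point, then take the argmax of each:

```python
import numpy as np

def decode_pick_place(pick_map, place_map):
    """Return (row, col) pixel coordinates for the pick and place actions.

    Generic argmax decoding over per-pixel score maps; it illustrates
    the action format only, not SSFold's actual output head.
    """
    pick = np.unravel_index(np.argmax(pick_map), pick_map.shape)
    place = np.unravel_index(np.argmax(place_map), place_map.shape)
    return pick, place

# Toy score maps standing in for network output on a 64x64 top-down view.
rng = np.random.default_rng(seed=0)
pick_map, place_map = rng.random((64, 64)), rng.random((64, 64))
pick, place = decode_pick_place(pick_map, place_map)
```

The decoded pixel coordinates would then be deprojected through the camera calibration into robot-frame grasp and release poses.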
Changshi Zhou
Shanghai Research Institute for Intelligent Autonomous Systems, National Key Laboratory of Autonomous Intelligent Unmanned Systems (Tongji University), Frontiers Science Center for Intelligent Autonomous Systems
Haichuan Xu
Shanghai Research Institute for Intelligent Autonomous Systems, National Key Laboratory of Autonomous Intelligent Unmanned Systems (Tongji University), Frontiers Science Center for Intelligent Autonomous Systems
Jiarui Hu
Zhejiang University
Computer Vision, Robotics, Computer Graphics
Feng Luan
Assistant Professor, School of Electrical and Electronic Engineering, Nanyang Technological University
Fibre optics, Nonlinear fibre optics
Zhipeng Wang
College of Electronics and Information Engineering, Tongji University, National Key Laboratory of Autonomous Intelligent Unmanned Systems (Tongji University), Frontiers Science Center for Intelligent Autonomous Systems
Yanchao Dong
College of Electronics and Information Engineering, Tongji University, National Key Laboratory of Autonomous Intelligent Unmanned Systems (Tongji University), Frontiers Science Center for Intelligent Autonomous Systems
Yanmin Zhou
College of Electronics and Information Engineering, Tongji University, National Key Laboratory of Autonomous Intelligent Unmanned Systems (Tongji University), Frontiers Science Center for Intelligent Autonomous Systems
Bin He
College of Electronics and Information Engineering, Tongji University, National Key Laboratory of Autonomous Intelligent Unmanned Systems (Tongji University), Frontiers Science Center for Intelligent Autonomous Systems