🤖 AI Summary
Existing robotic manipulation datasets predominantly focus on simple rearrangement tasks, failing to capture the contact-rich physical dynamics inherent in industrial assembly and disassembly. To address this gap, we introduce REASSEMBLE, the first multimodal benchmark dataset designed specifically for contact-intensive assembly and disassembly tasks. Built upon the NIST Assembly Task Board 1, it covers four fundamental actions (pick, insert, remove, and place) across 17 distinct objects, with 4,035 successful demonstrations out of 4,551 recorded (781 minutes in total). Data are synchronously captured from event cameras, 6-DoF force-torque sensors, microphones, and multi-view RGB cameras. Each trajectory is segmented, annotated with object states and contact phases, and labeled for success. REASSEMBLE is the first dataset to systematically cover the full contact-dynamics cycle of assembly and disassembly, enabling force-vision co-modeling, robust action generalization, and multimodal policy learning. The complete dataset is publicly released.
📝 Abstract
Robotic manipulation remains a core challenge in robotics, particularly for contact-rich tasks such as industrial assembly and disassembly. Existing datasets have significantly advanced manipulation learning but focus primarily on simpler tasks such as object rearrangement, falling short of capturing the complexity and physical dynamics involved in assembly and disassembly. To bridge this gap, we present REASSEMBLE (Robotic assEmbly disASSEMBLy datasEt), a new dataset designed specifically for contact-rich manipulation tasks. Built around the NIST Assembly Task Board 1 benchmark, REASSEMBLE includes four actions (pick, insert, remove, and place) involving 17 objects. The dataset contains 4,551 demonstrations, of which 4,035 were successful, spanning a total of 781 minutes. It features multi-modal sensor data from event cameras, force-torque sensors, microphones, and multi-view RGB cameras, supporting research on contact-rich manipulation learning, task condition identification, and action segmentation, among other areas. We believe REASSEMBLE will be a valuable resource for advancing robotic manipulation in complex, real-world scenarios. The dataset is publicly available on our project website: https://dsliwowski1.github.io/REASSEMBLE_page.
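To make the dataset's composition concrete, the sketch below models one demonstration record as described above: an action label, a target object, a per-trajectory success flag, a duration, and named multimodal sensor streams. This is a minimal illustration only; the actual file layout and field names are defined on the project website, and every identifier here (`Demonstration`, `target_object`, the sensor keys) is a hypothetical assumption, not the dataset's real schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical record structure for one REASSEMBLE demonstration.
# Field names are illustrative assumptions, not the dataset's actual schema.
@dataclass
class Demonstration:
    action: str            # one of "pick", "insert", "remove", "place"
    target_object: str     # one of the 17 NIST Task Board 1 objects
    success: bool          # per-trajectory success label
    duration_s: float      # length of the recording in seconds
    # Named sensor streams, e.g. "rgb", "event", "force_torque", "audio"
    sensors: Dict[str, list] = field(default_factory=dict)

def successful_by_action(demos: List[Demonstration], action: str) -> List[Demonstration]:
    """Filter to successful demonstrations of a given action type."""
    return [d for d in demos if d.success and d.action == action]

# Toy usage with made-up entries (object names are invented for illustration).
demos = [
    Demonstration("insert", "gear_small", True, 12.4, {"force_torque": [], "rgb": []}),
    Demonstration("insert", "peg_16mm", False, 9.1),
    Demonstration("pick", "nut_m8", True, 4.2),
]
print(len(successful_by_action(demos, "insert")))  # → 1
```

A flat list of such records makes the paper's headline statistics (successful vs. total demonstrations, per-action counts) simple aggregation queries.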