🤖 AI Summary
Existing language-guided robotic manipulation methods rely heavily on action-annotated data and, because they predominantly predict object-centric optical flow, struggle with complex scenarios such as deformable objects, severe occlusions, and motions without clear object displacement. To address these limitations, the authors propose Embodiment-Centric Flow (EC-Flow), a framework that predicts the motion of the embodiment itself and incorporates its inherent kinematic priors, improving generalization to deformation, occlusion, and non-object-displacement tasks. A goal-alignment module jointly optimizes movement consistency and goal-image prediction to connect EC-Flow with language instructions and object interactions, while a URDF-driven kinematic transformation converts predicted flow into executable actions, yielding an end-to-end video-to-action pipeline learned from action-unlabeled videos. Experiments in both simulation (Meta-World) and real-world settings demonstrate state-of-the-art performance, with substantial improvements over prior object-centric flow methods: 62% in occluded object handling, 45% in deformable object manipulation, and 80% in non-object-displacement tasks.
📝 Abstract
Current language-guided robotic manipulation systems often require low-level action-labeled datasets for imitation learning. While object-centric flow prediction methods mitigate this issue, they remain limited to scenarios involving rigid objects with clear displacement and minimal occlusion. In this work, we present Embodiment-Centric Flow (EC-Flow), a framework that directly learns manipulation from action-unlabeled videos by predicting embodiment-centric flow. Our key insight is that incorporating the embodiment's inherent kinematics significantly enhances generalization to versatile manipulation scenarios, including deformable object handling, occlusions, and non-object-displacement tasks. To connect the EC-Flow with language instructions and object interactions, we further introduce a goal-alignment module by jointly optimizing movement consistency and goal-image prediction. Moreover, translating EC-Flow to executable robot actions only requires a standard robot URDF (Unified Robot Description Format) file to specify kinematic constraints across joints, which makes it easy to use in practice. We validate EC-Flow on both simulation (Meta-World) and real-world tasks, demonstrating state-of-the-art performance, with improvements over prior object-centric flow methods in occluded object handling (62%), deformable object manipulation (45%), and non-object-displacement tasks (80%). For more information, see our project website at https://ec-flow1.github.io.
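To make the core idea concrete: embodiment-centric flow predicts per-keypoint displacements on the robot's own body rather than on manipulated objects. The abstract does not specify how flow is converted into actions beyond using the URDF's kinematic constraints, so the following is only an illustrative sketch, not the paper's implementation: given hypothetical gripper keypoints and their predicted flow, a least-squares rigid fit (2D Kabsch) recovers a rotation and translation that could serve as a planar motion command.

```python
import math

def estimate_rigid_motion(points, flow):
    """Fit a 2D rotation + translation to keypoint flow by least squares (Kabsch).

    points: list of (x, y) keypoints on the embodiment in the current frame.
    flow:   list of (dx, dy) predicted displacements, one per keypoint.
    Returns (theta, (tx, ty)) such that rotating each point by theta and
    translating by (tx, ty) best matches point + flow.
    """
    targets = [(x + dx, y + dy) for (x, y), (dx, dy) in zip(points, flow)]
    n = len(points)
    # Centroids of source and target keypoint sets.
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    tx = sum(q[0] for q in targets) / n
    ty = sum(q[1] for q in targets) / n
    # Accumulate the 2D cross-covariance terms of the centered point sets.
    s_cos = s_sin = 0.0
    for (x, y), (u, v) in zip(points, targets):
        ax, ay = x - cx, y - cy
        bx, by = u - tx, v - ty
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    # Translation maps the rotated source centroid onto the target centroid.
    t = (tx - (cx * math.cos(theta) - cy * math.sin(theta)),
         ty - (cx * math.sin(theta) + cy * math.cos(theta)))
    return theta, t

# Example: every keypoint shifts by (2, 3) -> pure translation, no rotation.
theta, t = estimate_rigid_motion([(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
                                 [(2.0, 3.0)] * 3)
# theta ≈ 0.0, t ≈ (2.0, 3.0)
```

In the actual method, the URDF supplies the joint structure needed to map such embodiment motion through the robot's full kinematic chain; this sketch only shows the flow-to-rigid-motion fitting step in the plane.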