Flow with the Force Field: Learning 3D Compliant Flow Matching Policies from Force and Demonstration-Guided Simulation Data

📅 2025-10-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenges of insufficient force awareness, poor compliance, and Sim2Real transfer difficulty in contact-intensive robotic manipulation, this paper proposes a force-augmented visual imitation learning framework. Methodologically, it introduces (1) a force-field-guided approach that synthesizes force-aware simulation data from a single human demonstration; (2) a cross-modal imitation learning architecture integrating visual and force-tactile observations; and (3) a policy network based on 3D compliant flow matching, explicitly modeling contact dynamics and environmental adaptability. Evaluated on real-robot non-prehensile block flipping and bimanual object relocation tasks, the method achieves stable, continuous contact control, significantly improving robustness and cross-scenario generalization. It effectively bridges the sim-to-real performance gap, demonstrating superior transferability without requiring real-world force labeling or extensive domain randomization.
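The summary names a policy network based on flow matching. As background, a minimal, generic conditional flow matching sketch is shown below: a velocity field is regressed onto the straight-line target `x1 - x0` along interpolants `x_t = (1-t)·x0 + t·x1`, then integrated from noise to produce a sample. This is an illustration of the generic technique only, not the paper's 3D compliant architecture; the linear model, toy "demo action", and all hyperparameters are hypothetical choices.

```python
import numpy as np

# Generic conditional flow matching sketch (NOT the paper's network).
# Model: a linear velocity field v(x, t) = W @ [x; t; 1], trained to match
# the straight-line target u = x1 - x0 at interpolated points.

rng = np.random.default_rng(0)
dim = 3  # hypothetical: a 3D action target

def velocity(W, x, t):
    feat = np.concatenate([x, [t, 1.0]])
    return W @ feat

def fm_loss_and_grad(W, x0, x1, t):
    xt = (1.0 - t) * x0 + t * x1        # interpolant between noise and data
    target = x1 - x0                    # flow-matching regression target
    feat = np.concatenate([xt, [t, 1.0]])
    err = W @ feat - target
    loss = 0.5 * np.sum(err ** 2)
    grad = np.outer(err, feat)          # gradient of the squared error w.r.t. W
    return loss, grad

x1 = np.array([1.0, 2.0, 3.0])          # toy "demonstration" action
W = np.zeros((dim, dim + 2))
losses = []
for _ in range(2000):                   # plain SGD on the flow-matching loss
    x0 = rng.standard_normal(dim)       # sample from the noise base distribution
    t = rng.uniform()
    loss, grad = fm_loss_and_grad(W, x0, x1, t)
    W -= 0.01 * grad
    losses.append(loss)

# Sampling: integrate the learned velocity field from noise with Euler steps.
x = rng.standard_normal(dim)
n_steps = 20
for k in range(n_steps):
    x = x + (1.0 / n_steps) * velocity(W, x, k / n_steps)
```

With enough SGD steps the training loss drops toward the residual that the simple linear field cannot explain, and the Euler integration pushes a noise sample toward the demonstrated action; a real policy would condition the field on visual and force observations.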

📝 Abstract
While visuomotor policies have advanced in recent years, contact-rich tasks remain a challenge. Robotic manipulation tasks that require continuous contact demand explicit handling of compliance and force. However, most visuomotor policies ignore compliance, overlooking the importance of physical interaction with the real world, often leading to excessive contact forces or fragile behavior under uncertainty. Introducing force information into vision-based imitation learning could improve contact awareness, but may also require large amounts of data to perform well. One remedy for data scarcity is to generate data in simulation, yet computationally taxing processes are required to produce data good enough not to suffer from the Sim2Real gap. In this work, we introduce a framework for generating force-informed data in simulation, instantiated from a single human demonstration, and show how coupling it with a compliant policy improves the performance of a visuomotor policy learned from synthetic data. We validate our approach on real-robot tasks, including non-prehensile block flipping and a bimanual object-moving task, where the learned policy exhibits reliable contact maintenance and adaptation to novel conditions. Project Website: https://flow-with-the-force-field.github.io/webpage/
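The compliance the abstract emphasizes is commonly realized with an impedance-style control law, in which the commanded force is a spring-damper response to position and velocity error, so the robot yields under external contact forces rather than fighting them. The 1-DoF sketch below illustrates that generic idea only; it is not the paper's controller, and the gains, mass, and external force are hypothetical values.

```python
# Generic 1-DoF impedance (compliant) control sketch -- illustrates the kind
# of compliance the abstract refers to, NOT the paper's controller.
# Commanded force: f_cmd = k*(x_des - x) + d*(v_des - v)
k, d, m, dt = 50.0, 10.0, 1.0, 0.001   # hypothetical stiffness, damping, mass, timestep
x, v = 0.0, 0.0                        # position and velocity state
x_des = 0.1                            # desired position
f_ext = -2.0                           # constant external contact force pushing back

for _ in range(5000):                  # 5 s of forward-Euler simulation
    f_cmd = k * (x_des - x) + d * (0.0 - v)
    a = (f_cmd + f_ext) / m            # Newton's second law with contact force
    v += a * dt
    x += v * dt
# Steady state: the spring deflects until k*(x_des - x) balances f_ext,
# i.e. x -> x_des + f_ext/k = 0.1 - 0.04 = 0.06
```

The key property is that the steady-state position deviates from the target by `f_ext/k`: a softer spring (smaller `k`) yields more under contact, which is exactly the trade-off a compliant policy must manage.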
Problem

Research questions and friction points this paper is trying to address.

Learning compliant robotic manipulation policies from force data
Addressing Sim2Real gap in contact-rich task training
Improving visuomotor policies with force-informed simulation data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generates force-informed simulation data from single demonstration
Learns compliant visuomotor policies from synthetic force data
Combines force fields with demonstration-guided simulation training