ReGen: Generative Robot Simulation via Inverse Design

📅 2025-11-06
🏛️ International Conference on Learning Representations
📈 Citations: 3
Influential: 0
🤖 AI Summary
Current robotic simulation environments rely heavily on manual construction, lacking automated and controllable scene generation methods. This paper proposes a generative simulation framework based on causal graph expansion, employing inverse design: given a robot’s objective function or behavioral trajectory—augmented with natural language descriptions—it infers and synthesizes plausible simulation scenes that would elicit such behavior. The method integrates large language models with directed causal graphs to compile multimodal inputs into structured scene graphs, which are then translated into executable symbolic simulation programs. Key contributions include support for counterfactual scene manipulation, multimodal perceptual reasoning, and agent-centric cognitive modeling—particularly for edge-case simulation enhancement. Evaluated on autonomous driving and robotic manipulation tasks, the framework significantly improves simulation diversity, complexity, and generation success rate, thereby enabling robust policy validation and high-quality synthetic data augmentation.

📝 Abstract
Simulation plays a key role in scaling robot learning and validating policies, but constructing simulations remains a labor-intensive process. This paper introduces ReGen, a generative simulation framework that automates simulation design via inverse design. Given a robot's behavior -- such as a motion trajectory or an objective function -- and its textual description, ReGen infers plausible scenarios and environments that could have caused the behavior. ReGen leverages large language models to synthesize scenarios by expanding a directed graph that encodes cause-and-effect relationships, relevant entities, and their properties. This structured graph is then translated into a symbolic program, which configures and executes a robot simulation environment. Our framework supports (i) augmenting simulations based on ego-agent behaviors, (ii) controllable, counterfactual scenario generation, (iii) reasoning about agent cognition and mental states, and (iv) reasoning with distinct sensing modalities, such as braking due to faulty GPS signals. We demonstrate ReGen in autonomous driving and robot manipulation tasks, generating more diverse, complex simulated environments compared to existing simulations with high success rates, and enabling controllable generation for corner cases. This approach enhances the validation of robot policies and supports data or simulation augmentation, advancing scalable robot learning for improved generalization and robustness. We provide code and example videos at: https://regen-sim.github.io/
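The abstract describes expanding a directed cause-and-effect graph backwards from an observed behavior. As an illustration only (the node schema, the `CAUSE_LIBRARY` lookup standing in for the LLM query, and all labels are assumptions, not ReGen's actual interface), a minimal sketch of such an inverse expansion might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str                     # e.g. "ego vehicle brakes hard"
    kind: str                      # "behavior" or "cause"
    children: list = field(default_factory=list)

# Hypothetical stand-in for the LLM call that proposes plausible causes
# for a behavior; ReGen queries a language model at this step.
CAUSE_LIBRARY = {
    "ego vehicle brakes hard": [
        ("pedestrian enters crosswalk", "cause"),
        ("faulty GPS reports wrong position", "cause"),
    ],
}

def expand(node: Node, depth: int = 1) -> Node:
    """Grow the cause-and-effect graph backwards from an observed behavior."""
    if depth == 0:
        return node
    for label, kind in CAUSE_LIBRARY.get(node.label, []):
        node.children.append(expand(Node(label, kind), depth - 1))
    return node

root = expand(Node("ego vehicle brakes hard", "behavior"))
print([c.label for c in root.children])
```

Each expansion step adds candidate causes and their entities; repeating it yields the structured scene graph that the framework then compiles into a simulation program.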
Problem

Research questions and friction points this paper is trying to address.

Constructing robot simulations is labor-intensive and does not scale
Existing environments lack controllable generation of plausible scenarios that explain observed robot behaviors
Policy validation requires diverse, complex simulated scenarios, including corner cases
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generative simulation framework using inverse design
Leverages language models for scenario synthesis
Translates structured graphs into symbolic simulation programs
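The last bullet can be sketched concretely. In this illustrative example (the graph dictionary format and the `spawn`/`set`/`trigger` command names are assumptions, not ReGen's actual symbolic language), a scene graph is compiled into a flat list of simulator-configuration commands:

```python
def to_program(graph: dict) -> list:
    """Compile a structured scene graph into symbolic simulator commands.
    Command names and the graph schema are illustrative assumptions."""
    cmds = []
    # Instantiate each entity and configure its properties.
    for entity, props in graph["entities"].items():
        cmds.append(f"spawn({entity})")
        for key, value in props.items():
            cmds.append(f"set({entity}, {key}, {value})")
    # Encode cause-and-effect edges as event triggers.
    for cause, effect in graph["edges"]:
        cmds.append(f"trigger({cause} -> {effect})")
    return cmds

scene = {
    "entities": {
        "pedestrian": {"position": "crosswalk"},
        "ego_vehicle": {"speed": "low"},
    },
    "edges": [("pedestrian crosses", "ego brakes")],
}
for cmd in to_program(scene):
    print(cmd)
```

Executing such a program against a simulator backend (e.g. a driving or manipulation environment) is what turns the inferred graph into a runnable scenario.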