AI Summary
This work addresses the critical challenge of generating large-scale, high-fidelity articulated objects (e.g., hinge- and slide-based mechanisms) required for embodied AI. We propose the first parameterized, physics-aware generative framework specifically designed for articulated structures. Unlike prior methods constrained by limited data volume, noisy annotations, or low simulation fidelity, our approach integrates geometric constraint solving, rigid-body dynamics modeling, parametric CAD primitives, and differentiable rendering optimization to achieve photorealistic, editable, and scalable procedural generation. Quantitative evaluation and user studies demonstrate significant improvements over state-of-the-art methods. Moreover, synthetic data generated by our framework substantially enhances the performance of downstream 3D generative models, validating its efficacy and practical utility as a high-quality synthetic data source.
Abstract
High-quality articulated objects at large scale are urgently needed for many tasks in embodied AI. Most existing methods for creating articulated objects are either data-driven or simulation-based, and are limited by the scale and quality of the training data or by the fidelity and heavy labour of simulation. In this paper, we propose Infinite Mobility, a novel method for synthesizing high-fidelity articulated objects through procedural generation. A user study and quantitative evaluation demonstrate that our method produces results that surpass current state-of-the-art methods and are comparable to human-annotated datasets in both physical properties and mesh quality. Furthermore, we show that our synthetic data can serve as training data for generative models, enabling the next step of scaling up. Code is available at https://github.com/Intern-Nexus/Infinite-Mobility
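To make the idea of parameterized procedural generation concrete, the sketch below builds a toy articulated object, a cabinet with one hinged door, as a URDF description from a few shape and joint parameters. This is purely illustrative: the function name, parameters, and structure are hypothetical and are not taken from the Infinite Mobility codebase, which generates far richer geometry and articulation trees.

```python
# Illustrative sketch only: a toy parameterized generator for a cabinet
# with a single revolute (hinge) door joint, emitted as URDF.
# All names and defaults here are hypothetical, not the paper's API.
import xml.etree.ElementTree as ET


def make_cabinet_urdf(width=0.6, height=0.8, depth=0.4, max_open_rad=1.57):
    """Build a minimal URDF string with two box links and one hinge joint."""
    robot = ET.Element("robot", name="cabinet")

    def add_box_link(name, size):
        # Each link gets a simple box visual; real pipelines would attach
        # detailed meshes, collision shapes, and inertial properties.
        link = ET.SubElement(robot, "link", name=name)
        visual = ET.SubElement(link, "visual")
        geom = ET.SubElement(visual, "geometry")
        ET.SubElement(geom, "box", size=" ".join(f"{s:.3f}" for s in size))
        return link

    add_box_link("body", (width, depth, height))
    add_box_link("door", (width, 0.02, height))

    # Hinge along the left front edge of the cabinet body, rotating about z.
    joint = ET.SubElement(robot, "joint", name="door_hinge", type="revolute")
    ET.SubElement(joint, "parent", link="body")
    ET.SubElement(joint, "child", link="door")
    ET.SubElement(joint, "origin",
                  xyz=f"{-width / 2:.3f} {depth / 2:.3f} 0", rpy="0 0 0")
    ET.SubElement(joint, "axis", xyz="0 0 1")
    ET.SubElement(joint, "limit", lower="0", upper=f"{max_open_rad:.2f}",
                  effort="10", velocity="1")
    return ET.tostring(robot, encoding="unicode")


if __name__ == "__main__":
    print(make_cabinet_urdf(width=0.9))
```

Because every dimension and joint limit is an explicit parameter, such a generator can emit unlimited physically consistent variants, which is the property that lets procedural data scale where human annotation cannot.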