🤖 AI Summary
Existing motion generation models often neglect individual body-shape variation, relying instead on a generic average human template, which leads to physically implausible and anatomically homogeneous motions. To address this, we propose the first generative motion model conditioned on 3D body shape (parameterized by SMPL-X), enabling body-aware motion synthesis without requiring paired motion-capture data. Our approach jointly models the coupling between body shape and motion dynamics, incorporating a cycle-consistency loss, physics-based constraints derived from kinematics and dynamics, and stability regularization. Quantitative evaluation shows consistent improvements over state-of-the-art methods on standard metrics, including FID, Jitter, and Diversity. Qualitative analysis further confirms anatomical plausibility, inertial consistency, and strong generalization across body shapes. Overall, our method significantly improves both the physical realism and the inter-individual diversity of synthesized human motion.
📝 Abstract
Generating realistic human motion is essential for many computer vision and graphics applications. The wide variety of human body shapes and sizes greatly impacts how people move. However, most existing motion models ignore these differences, relying instead on a standardized, average body. This leads to uniform motion across different body types, where movements do not match their physical characteristics, limiting diversity. To address this, we introduce a generative motion model conditioned on body shape. We show that this model can be trained on unpaired data by applying cycle-consistency, intuitive-physics, and stability constraints, which capture the relationship between identity and movement. The resulting model generates diverse, physically plausible, and dynamically stable human motions that are both quantitatively and qualitatively more realistic than those of current state-of-the-art methods. More details are available on our project page: https://CarstenEpic.github.io/humos/.
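The training objective described above combines cycle consistency, intuitive physics, and stability. A minimal sketch of how such terms might be combined is shown below; the function names, loss formulations, and weights here are illustrative assumptions for intuition, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical sketch of an unpaired training objective with cycle-consistency,
# intuitive-physics, and stability terms. All names and weights are assumptions.

def cycle_consistency_loss(motion, motion_cycled):
    # Motion transferred to another body shape and back should match the original.
    return float(np.mean((motion - motion_cycled) ** 2))

def physics_loss(joint_positions, dt=1.0 / 30.0):
    # Penalize large joint accelerations (finite second differences) as a
    # simple intuitive-physics proxy for dynamic plausibility.
    accel = np.diff(joint_positions, n=2, axis=0) / dt**2
    return float(np.mean(accel ** 2))

def stability_loss(com_xy, support_center_xy):
    # Keep the ground-projected center of mass near the support region.
    return float(np.mean(np.sum((com_xy - support_center_xy) ** 2, axis=-1)))

def total_loss(motion, motion_cycled, joints, com_xy, support_xy,
               w_phys=0.1, w_stab=0.01):
    # Weighted sum of the three terms; weights are illustrative.
    return (cycle_consistency_loss(motion, motion_cycled)
            + w_phys * physics_loss(joints)
            + w_stab * stability_loss(com_xy, support_xy))
```

In this toy form, a static motion that cycles back to itself incurs zero loss, while any mismatch after the shape-transfer round trip, large accelerations, or a drifting center of mass increases the objective.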