MotionPersona: Characteristics-aware Locomotion Control

📅 2025-05-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing character controllers struggle to model real-world human diversity, producing homogeneous motions that lack responsiveness to anatomical features, psychological states, and demographic attributes. To address this, we propose the first fine-grained trait-driven real-time gait control system. Our method introduces: (1) a multidimensional conditional framework jointly conditioning on SMPLX pose parameters, textual prompts, and user-defined control signals; (2) a few-shot character profiling technique based on short motion clips, overcoming representational limitations of text-only prompting; and (3) a block-wise autoregressive motion diffusion model, accompanied by a high-quality, publicly released dataset covering diverse traits and gait styles. Experiments demonstrate significant improvements over state-of-the-art methods in trait fidelity, motion quality, and stylistic diversity, while enabling real-time interactive generation. Code, data, and interactive demos are open-sourced.

📝 Abstract
We present MotionPersona, a novel real-time character controller that allows users to characterize a character by specifying attributes such as physical traits, mental states, and demographics, and projects these properties into the generated motions for animating the character. In contrast to existing deep learning-based controllers, which typically produce homogeneous animations tailored to a single, predefined character, MotionPersona accounts for the impact of various traits on human motion as observed in the real world. To achieve this, we develop a block autoregressive motion diffusion model conditioned on SMPLX parameters, textual prompts, and user-defined locomotion control signals. We also curate a comprehensive dataset featuring a wide range of locomotion types and actor traits to enable the training of this characteristic-aware controller. Unlike prior work, MotionPersona is the first method capable of generating motion that faithfully reflects user-specified characteristics (e.g., an elderly person's shuffling gait) while responding in real time to dynamic control inputs. Additionally, we introduce a few-shot characterization technique as a complementary conditioning mechanism, enabling customization via short motion clips when language prompts fall short. Through extensive experiments, we demonstrate that MotionPersona outperforms existing methods in characteristics-aware locomotion control, achieving superior motion quality and diversity. Results, code, and demo can be found at: https://motionpersona25.github.io/.
Problem

Research questions and friction points this paper is trying to address.

How to generate diverse motions that reflect user-specified character traits
How to control animations in real time while capturing real-world human motion variation
How to support few-shot customization when textual prompts alone are insufficient
Innovation

Methods, ideas, or system contributions that make the work stand out.

Block autoregressive motion diffusion model for real-time generation
Joint conditioning on SMPLX parameters, textual prompts, and locomotion control signals
Few-shot characterization technique for customization via short motion clips
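The core generation scheme named above, block-wise autoregressive diffusion, can be sketched at a high level: motion is produced one block of frames at a time, each block denoised from noise while conditioned on the previously generated block and on the trait/control signals. The sketch below is a toy illustration of that control flow only; `denoise_step`, the array shapes, and the constant conditioning signal are all hypothetical stand-ins, not the paper's actual network or SMPLX representation.

```python
import numpy as np

def denoise_step(x_t, t, context, cond):
    """Hypothetical stand-in for the learned denoiser. In MotionPersona this
    would be a neural network conditioned on SMPLX parameters, text prompts,
    and control signals; here it just shrinks toward the conditioning."""
    return x_t + 0.5 * (cond - x_t) / (t + 1)

def sample_block(context, cond, block_len=8, pose_dim=4, n_steps=10, rng=None):
    """Denoise one motion block from Gaussian noise, conditioned on the
    previously generated block (the autoregressive context)."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = rng.standard_normal((block_len, pose_dim))  # start from pure noise
    for t in reversed(range(n_steps)):
        x = denoise_step(x, t, context, cond)
    return x

def generate_motion(n_blocks=4, block_len=8, pose_dim=4):
    """Block-wise autoregressive generation: each new block is sampled with
    the last generated block as context, so motion stays temporally coherent
    while remaining controllable block by block (real-time friendly)."""
    cond = np.ones((block_len, pose_dim))      # stand-in for trait/control conditioning
    context = np.zeros((block_len, pose_dim))  # empty context before the first block
    blocks = []
    for _ in range(n_blocks):
        block = sample_block(context, cond, block_len, pose_dim)
        blocks.append(block)
        context = block  # autoregressive hand-off to the next block
    return np.concatenate(blocks, axis=0)
```

The block-wise structure is what distinguishes this from whole-sequence diffusion: only a short block is denoised per step, so new control inputs can influence the very next block rather than requiring the full sequence to be regenerated.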