PP-Motion: Physical-Perceptual Fidelity Evaluation for Human Motion Generation

πŸ“… 2025-08-11
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This paper addresses the challenge of simultaneously ensuring physical feasibility and perceptual realism in human motion generation evaluation. To this end, we propose PP-Motionβ€”the first end-to-end assessment framework that jointly incorporates physical priors and human perceptual modeling. Methodologically, we construct objective ground truth via physics-based annotation and optimize a dual-constraint loss comprising Pearson correlation loss (to enforce dynamical consistency) and perceptual fidelity loss (trained on human motion discrimination). Our key contributions are: (i) introducing minimal physical correction magnitude as a continuous, fine-grained evaluation metric; and (ii) unifying physical plausibility and subjective realism within a single differentiable framework. Experiments demonstrate that PP-Motion significantly outperforms existing metrics in physical consistency and achieves strong correlation (r > 0.92) with human perceptual quality scores. Thus, PP-Motion establishes a novel, interpretable, and differentiable paradigm for assessing generated motion quality.
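The summary mentions a Pearson correlation loss used to enforce consistency between predicted fidelity scores and the physics-based annotations. As a rough illustration only (this is not the authors' code, and the function name and framework choice are assumptions), such a loss can be written as one minus the Pearson coefficient between predicted and target scores:

```python
import numpy as np

def pearson_correlation_loss(pred, target, eps=1e-8):
    """Illustrative loss: 1 - Pearson r between predicted fidelity
    scores and physics-based target scores. Minimized (≈0) when the
    two score vectors are perfectly linearly correlated."""
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    pc = pred - pred.mean()      # center predictions
    tc = target - target.mean()  # center targets
    r = (pc * tc).sum() / (np.sqrt((pc ** 2).sum() * (tc ** 2).sum()) + eps)
    return 1.0 - r
```

In a training setup like the one described, this term would be combined with a perceptual fidelity loss; because correlation is scale- and shift-invariant, it rewards matching the *ranking and relative spacing* of physical-plausibility scores rather than their absolute values.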

πŸ“ Abstract
Human motion generation has found widespread applications in AR/VR, film, sports, and medical rehabilitation, offering a cost-effective alternative to traditional motion capture systems. However, evaluating the fidelity of such generated motions is a crucial, multifaceted task. Although previous approaches have attempted motion fidelity evaluation using human perception or physical constraints, there remains an inherent gap between human-perceived fidelity and physical feasibility. Moreover, the subjective and coarse binary labeling of human perception further undermines the development of a robust data-driven metric. We address these issues by introducing a physical labeling method. This method evaluates motion fidelity by calculating the minimum modifications needed for a motion to align with physical laws. With this approach, we are able to produce fine-grained, continuous physical alignment annotations that serve as objective ground truth. With these annotations, we propose PP-Motion, a novel data-driven metric to evaluate both the physical and perceptual fidelity of human motion. To effectively capture underlying physical priors, we employ Pearson's correlation loss for the training of our metric. Additionally, by incorporating a human-based perceptual fidelity loss, our metric can capture fidelity that simultaneously considers both human perception and physical alignment. Experimental results demonstrate that our metric, PP-Motion, not only aligns with physical laws but also aligns better with human perception of motion fidelity than previous work.
Problem

Research questions and friction points this paper is trying to address.

Bridging human-perceived fidelity and physical feasibility in motion generation
Developing objective ground truth for continuous physical alignment annotations
Creating a metric evaluating both physical and perceptual motion fidelity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Physical labeling method for motion fidelity
PP-Motion metric combines physical and perceptual fidelity
Pearson's correlation loss captures physical priors
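The "physical labeling" contribution scores a motion by the minimum modification needed to make it physically plausible. A toy sketch of that idea (purely illustrative; the constraint, function name, and score definition here are assumptions, not the paper's actual annotation pipeline) is to project a joint trajectory onto one simple physical constraint, no ground penetration, and report the correction magnitude:

```python
import numpy as np

def physical_label(joints, ground_z=0.0):
    """Toy 'minimal correction' score for a motion clip.
    joints: (frames, num_joints, 3) array with z-up coordinates.
    Clamps any joint below the ground plane back onto it and returns
    the mean per-joint correction distance; 0 means no violation."""
    joints = np.asarray(joints, dtype=float)
    corrected = joints.copy()
    corrected[..., 2] = np.maximum(corrected[..., 2], ground_z)
    return np.linalg.norm(corrected - joints, axis=-1).mean()
```

The real annotation method presumably enforces a much richer set of dynamical constraints, but the same principle holds: the smaller the required correction, the more physically faithful the motion, which yields a continuous label rather than a binary real/fake one.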
πŸ”Ž Similar Papers
No similar papers found.
Sihan Zhao
Tsinghua University, Beijing, China
Zixuan Wang
Tsinghua University, Beijing, China
Tianyu Luan
University at Buffalo
3D vision
Jia Jia
BNRist, Tsinghua University, Key Laboratory of Pervasive Computing, Ministry of Education, Beijing, China
Wentao Zhu
Eastern Institute of Technology, Ningbo, China
Jiebo Luo
University of Rochester, Rochester, NY, USA
Junsong Yuan
State University of New York at Buffalo
computer vision, video analytics, action and gesture analysis, multimedia, pattern recognition
Nan Xi
University at Buffalo
Computer Vision, Pattern Recognition, Medical AI