Morphology-Independent Facial Expression Imitation for Human-Face Robots

📅 2026-03-07
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing methods for robotic facial expression imitation are highly sensitive to variations in facial morphology, often resulting in distorted expressions. To address this limitation, this work proposes a morphology-invariant, high-fidelity expression imitation approach that, within a self-supervised framework, achieves the first successful disentanglement of expressive semantics from morphological features. Building on this disentanglement, the method establishes a cross-morphology transfer mechanism that directly maps perceived expression errors to robotic actuation commands. This strategy significantly enhances the robustness and realism of facial expression reproduction, consistently outperforming state-of-the-art methods across multiple evaluation tasks. Experiments are conducted on Pengrui, a custom-built, highly expressive humanoid robot platform, and the implementation code along with technical details will be made publicly available.

📝 Abstract
Accurate facial expression imitation on human-face robots is crucial for achieving natural human-robot interaction. Most existing methods achieve photorealistic expression imitation by mapping 2D facial landmarks to a robot's actuator commands. Their imitation of landmark trajectories is susceptible to interference from facial morphology, which leads to performance degradation. In this paper, we propose a morphology-independent expression imitation method that decouples expressions from facial morphology to eliminate morphological influence and produce more realistic expressions for human-face robots. Specifically, we construct an expression decoupling module that learns expression semantics by disentangling the expression representation from the morphology representation in a self-supervised manner. We devise an expression transfer module that maps these representations to the robot's actuator commands through a learning objective of perceiving expression errors, producing accurate facial expressions based on the learned expression semantics. To support experimental validation, a custom-designed and highly expressive human-face robot, named Pengrui, is developed to serve as an experimental platform for realistic expression imitation. Extensive experiments demonstrate that our method enables the human-face robot to reproduce a wide range of human-like expressions effectively. All code and implementation details of the robot will be released.
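The paper's actual networks are not specified in this listing, but the core idea of the abstract can be illustrated with a minimal linear sketch: if a face signal mixes an expression factor and a morphology factor, a disentangling encoder lets you swap one person's expression onto another's morphology, and a transfer head can then drive actuators from the expression code alone. Everything below (the mixing matrix `W`, latent sizes, the 6-actuator head `K`) is a hypothetical toy, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
D_E, D_M = 4, 4  # hypothetical latent sizes for expression / morphology

# Toy "rendering" process: a face vector is an invertible mix of the two factors.
W = rng.standard_normal((D_E + D_M, D_E + D_M))

def render(expr, morph):
    """Toy face generator: mixes expression and morphology latents."""
    return W @ np.concatenate([expr, morph])

def encode(face):
    """Toy disentangling encoder: inverts the mixing, then splits the factors."""
    z = np.linalg.solve(W, face)
    return z[:D_E], z[D_E:]

# Two faces with different morphologies and different expressions.
expr_a, morph_a = rng.standard_normal(D_E), rng.standard_normal(D_M)
expr_b, morph_b = rng.standard_normal(D_E), rng.standard_normal(D_M)
face_a, face_b = render(expr_a, morph_a), render(expr_b, morph_b)

# Self-supervised swap consistency: re-rendering with exchanged latents
# should match the face generated directly from the swapped factors.
ea, _ma = encode(face_a)
_eb, mb = encode(face_b)
swap_ab = render(ea, mb)              # A's expression on B's morphology
target = render(expr_a, morph_b)
swap_loss = float(np.mean((swap_ab - target) ** 2))
print(f"swap reconstruction loss: {swap_loss:.2e}")

# A hypothetical transfer head maps the expression code alone to actuator
# commands, so morphology never influences the robot's motion:
K = rng.standard_normal((6, D_E))     # 6 imagined actuators
commands = K @ ea
print("actuator command shape:", commands.shape)
```

In the linear toy the swap loss is numerically zero because the encoder exactly inverts the generator; in the paper's setting both maps would be learned networks and the swap consistency would instead serve as a self-supervised training objective.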
Problem

Research questions and friction points this paper is trying to address.

facial expression imitation
human-face robots
morphology interference
expression realism
facial morphology
Innovation

Methods, ideas, or system contributions that make the work stand out.

morphology-independent
expression decoupling
self-supervised learning
facial expression imitation
human-face robot
👥 Authors
Xu Chen, Rui Gao, Che Sun, Zhehang Liu, Yuwei Wu, Shuo Yang, Yunde Jia