GenFaceUI: Meta-Design of Generative Personalized Facial Expression Interfaces for Intelligent Agents

📅 2026-02-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes the Generative Personalized Facial Expression Interface (GPFEI), the first meta-design framework for generative facial expression interfaces, addressing key challenges in controllability, consistency, and contextual alignment when agents generate facial expressions at runtime. Centered on designers, GPFEI integrates character identity, context-to-expression mapping rules, and a semantic labeling system to support template creation, rule definition, and interactive iteration. Realized through generative design methodologies and a prototype tool, the framework improves perceived controllability and consistency of expression generation. A qualitative evaluation with twelve designers validates its effectiveness and highlights the need for structured visual mechanisms and lightweight interpretability to support design workflows.

📝 Abstract
This work investigates generative facial expression interfaces for intelligent agents from a meta-design perspective. We propose the Generative Personalized Facial Expression Interface (GPFEI) framework, which organizes rule-bounded spaces, character identity, and context-to-expression mapping to address challenges of control, coherence, and alignment in run-time facial expression generation. To operationalize this framework, we developed GenFaceUI, a proof-of-concept tool that enables designers to create templates, apply semantic tags, define rules, and iteratively test outcomes. We evaluated the tool through a qualitative study with twelve designers. The results show perceived gains in controllability and consistency, while revealing needs for structured visual mechanisms and lightweight explanations. These findings provide a conceptual framework, a proof-of-concept tool, and empirical insights that highlight both opportunities and challenges for advancing generative facial expression interfaces within a broader meta-design paradigm.
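The abstract's core mechanism, a rule-bounded, identity-constrained mapping from context to expression templates, can be pictured with a minimal sketch. This is an illustration only: the paper does not publish GenFaceUI's code, and every name here (`ExpressionTemplate`, `MappingRule`, `generate_expression`) is hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a rule-bounded context-to-expression mapping,
# in the spirit of the GPFEI framework; not GenFaceUI's actual implementation.

@dataclass
class ExpressionTemplate:
    """A designer-authored facial expression template with semantic tags."""
    name: str
    tags: set[str] = field(default_factory=set)

@dataclass
class MappingRule:
    """Maps a dialogue context to required expression tags."""
    context: str              # e.g. "user_greets"
    required_tags: set[str]   # tags a generated expression must carry
    priority: int = 0         # higher-priority rules win on conflict

def generate_expression(context, rules, templates, identity_tags):
    """Pick a template satisfying the highest-priority matching rule,
    constrained to the character's identity tags (the rule-bounded space)."""
    matching = sorted((r for r in rules if r.context == context),
                      key=lambda r: -r.priority)
    for rule in matching:
        for t in templates:
            # Template must satisfy the rule and stay within identity bounds.
            if rule.required_tags <= t.tags and t.tags <= identity_tags:
                return t.name
    return "neutral"  # fallback keeps output coherent when no rule fires
```

Constraining candidates to `identity_tags` is what keeps run-time generation consistent with the character, while the rule priority ordering keeps it controllable by the designer.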
Problem

Research questions and friction points this paper is trying to address.

facial expression generation
intelligent agents
controllability
coherence
alignment
Innovation

Methods, ideas, or system contributions that make the work stand out.

generative facial expression
meta-design
intelligent agents
context-expression mapping
designer-in-the-loop
Yate Ge
College of Design and Innovation, Tongji University; School of Design, Southern University of Science and Technology
Lin Tian
College of Design and Innovation, Tongji University
Yi Dai
Ph.D. Candidate, University of Michigan
Process control, model predictive control
Shuhan Pan
University of Washington
Yiwen Zhang
Huazhong University of Science and Technology
Storage systems, key-value stores
Qi Wang
Associate Professor, College of Design & Innovation, Tongji University
Wearables, Human-Computer Interaction, Healthcare Design, Smart Textiles, Interaction Design
Weiwei Guo
Tongji University & Shanghai Jiaotong University
Multimodal AI for EO/RS, HCI, HAI, HRI
Xiaohua Sun
School of Design, Institute of Robotics Research, Southern University of Science and Technology