🤖 AI Summary
Existing MER methods are constrained by predefined emotion categories, limiting their ability to capture the complexity and fine-grained distinctions inherent in human affect. This paper introduces Open-Vocabulary Multimodal Emotion Recognition (OV-MER), a paradigm that brings zero-shot, open-vocabulary learning to MER, enabling semantics-driven recognition of unseen, compositional, and psychologically grounded non-basic emotions (e.g., "bittersweet", "awe"). The contributions include: (1) the first open-vocabulary MER benchmark dataset; (2) a new evaluation metric based on semantic similarity; and (3) a zero-shot generalization architecture that integrates cross-modal alignment, CLIP-style contrastive learning, and mapping into a shared semantic embedding space. Experiments show substantial gains in fine-grained emotion classification accuracy and semantic plausibility, and confirm strong cross-category generalization.
📝 Abstract
Multimodal Emotion Recognition (MER) is a critical research area that seeks to decode human emotions from diverse data modalities. However, existing machine learning methods predominantly rely on predefined emotion taxonomies, which fail to capture the inherent complexity, subtlety, and multi-appraisal nature of human emotional experiences, as demonstrated by studies in psychology and cognitive science. To overcome this limitation, we advocate introducing the concept of open vocabulary into MER. This paradigm shift aims to enable models to predict emotions beyond a fixed label space, accommodating a flexible set of categories that better reflects the nuanced spectrum of human emotions. To this end, we propose a novel paradigm, Open-Vocabulary MER (OV-MER), which enables emotion prediction without being confined to a predefined space. Because constructing a dataset that encompasses the full range of human emotions is practically infeasible, we instead present a comprehensive solution comprising a newly curated database, novel evaluation metrics, and a preliminary benchmark. By advancing MER from basic emotions to more nuanced and diverse emotional states, we hope this work can inspire the next generation of MER, enhancing its generalizability and applicability in real-world scenarios.
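The abstract does not specify how the semantic-similarity-based metrics work. One plausible shape for open-vocabulary evaluation, sketched below purely as an illustration (the synonym table and function names are assumptions, not the paper's actual resources), is set-level precision/recall/F1 computed after collapsing free-form emotion labels into canonical synonym groups, so that a prediction like "happy" can match a reference label "joy":

```python
# Illustrative sketch of set-level matching for open-vocabulary emotion
# labels. The synonym table is a toy assumption; a real system might use
# a lexicon or embedding similarity instead.
SYNONYM_GROUPS = {
    "happy": "joy", "joyful": "joy", "joy": "joy",
    "sad": "sadness", "sorrowful": "sadness", "sadness": "sadness",
    "amazed": "awe", "awe": "awe",
    "bittersweet": "bittersweet",
}

def canonicalize(labels):
    # Map each free-form label to its synonym group; unknown labels
    # fall back to their lowercased form.
    return {SYNONYM_GROUPS.get(label.lower(), label.lower()) for label in labels}

def set_f1(predicted, reference):
    # Precision/recall over canonical label sets, combined into F1.
    pred, ref = canonicalize(predicted), canonicalize(reference)
    if not pred or not ref:
        return 0.0
    overlap = len(pred & ref)
    precision = overlap / len(pred)
    recall = overlap / len(ref)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

For example, `set_f1(["happy", "amazed"], ["joy", "awe"])` scores 1.0 even though no predicted string matches a reference string exactly, which is the property an open-vocabulary metric needs and a fixed-taxonomy accuracy lacks.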