MECO: A Multimodal Dataset for Emotion and Cognitive Understanding in Older Adults

📅 2026-04-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the gap in existing emotion recognition datasets, which typically overlook the impact of cognitive decline on emotional expression and physiological responses in older adults. To this end, we introduce MECO, the first multimodal dataset specifically designed for community-dwelling elderly individuals. Collected in real-world settings, MECO comprises approximately 38 hours of synchronized video, audio, EEG, and ECG signals from 42 participants, yielding 30,592 annotated samples. Each sample is jointly labeled with dimensional and categorical emotion annotations alongside Mini-Mental State Examination (MMSE) scores to reflect cognitive status. By integrating cognitive assessment with multimodal affective data, MECO enables research into personalized emotion recognition and early detection of mild cognitive impairment, while also providing baseline models and standardized protocols to support future studies in this underexplored domain.
📝 Abstract
While affective computing has advanced considerably, multimodal emotion prediction in aging populations remains underexplored, largely due to the scarcity of dedicated datasets. Existing multimodal benchmarks predominantly target young, cognitively healthy subjects, neglecting the influence of cognitive decline on emotional expression and physiological responses. To bridge this gap, we present MECO, a Multimodal dataset for Emotion and Cognitive understanding in Older adults. MECO includes 42 participants and provides approximately 38 hours of multimodal signals, yielding 30,592 synchronized samples. To maximize ecological validity, data collection followed standardized protocols within community-based settings. The modalities cover video, audio, electroencephalography (EEG), and electrocardiography (ECG). In addition, the dataset offers comprehensive annotations of emotional and cognitive states, including self-assessed valence, arousal, six basic emotions, and Mini-Mental State Examination cognitive scores. We further establish baseline benchmarks for both emotion and cognitive prediction. MECO serves as a foundational resource for multimodal modeling of affect and cognition in aging populations, facilitating downstream applications such as personalized emotion recognition and early detection of mild cognitive impairment (MCI) in real-world settings. The complete dataset and supplementary materials are available at https://maitrechen.github.io/meco-page/.
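As a concrete illustration of the dataset's described structure (synchronized video, audio, EEG, and ECG per sample, jointly labeled with valence, arousal, a categorical emotion, and an MMSE score), one sample could be modeled as a record like the following. This is a hypothetical sketch only; the field names and file layout are assumptions, not the released dataset's actual schema.

```python
from dataclasses import dataclass

@dataclass
class MecoSample:
    """One synchronized MECO sample (hypothetical layout;
    the released dataset's actual format may differ)."""
    participant_id: int   # one of the 42 participants
    video_path: str       # synchronized video clip
    audio_path: str       # corresponding audio segment
    eeg: list             # EEG signal for the segment
    ecg: list             # ECG signal for the segment
    valence: float        # self-assessed dimensional label
    arousal: float        # self-assessed dimensional label
    emotion: str          # one of the six basic emotion categories
    mmse: int             # Mini-Mental State Examination score (0-30)

# Minimal usage sketch with placeholder values:
sample = MecoSample(
    participant_id=1,
    video_path="p01/clip_0001.mp4",
    audio_path="p01/clip_0001.wav",
    eeg=[],
    ecg=[],
    valence=0.6,
    arousal=0.3,
    emotion="happiness",
    mmse=28,
)
```

The key point the sketch conveys is that every sample carries both affective labels and a cognitive-status score, which is what enables the joint emotion/cognition modeling the paper describes.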
Problem

Research questions and friction points this paper is trying to address.

multimodal emotion prediction
aging populations
cognitive decline
emotion recognition
mild cognitive impairment
Innovation

Methods, ideas, or system contributions that make the work stand out.

multimodal dataset
emotion recognition
cognitive assessment
older adults
ecological validity
Hongbin Chen
Nanjing Medical University
Jie Li
Nanjing Medical University
Wei Wang
Nanjing Medical University
Siyang Song
Lecturer (AP), University of Exeter
Social Signal Processing, Affective Computing, Machine Learning, Human-Computer Interaction
Xiao Gu
University of Oxford
AI for Healthcare, Biomedical Signal Processing, Wearable/Ambient Intelligence, Deep Learning
Jianqing Li
Nanjing Medical University
Wentao Xiang
Nanjing Medical University