🤖 AI Summary
This study addresses a gap in existing emotion recognition datasets, which typically overlook the impact of cognitive decline on emotional expression and physiological responses in older adults. To this end, we introduce MECO, the first multimodal emotion dataset specifically designed for community-dwelling older adults. Collected in real-world settings, MECO comprises approximately 38 hours of synchronized video, audio, EEG, and ECG signals from 42 participants, yielding 30,592 annotated samples. Each sample is jointly labeled with dimensional (valence/arousal) and categorical emotion annotations, alongside Mini-Mental State Examination (MMSE) scores reflecting cognitive status. By integrating cognitive assessment with multimodal affective data, MECO enables research into personalized emotion recognition and early detection of mild cognitive impairment, and it further provides baseline models and standardized protocols to support future studies in this underexplored domain.
📝 Abstract
While affective computing has advanced considerably, multimodal emotion prediction in aging populations remains underexplored, largely due to the scarcity of dedicated datasets. Existing multimodal benchmarks predominantly target young, cognitively healthy subjects, neglecting the influence of cognitive decline on emotional expression and physiological responses. To bridge this gap, we present MECO, a Multimodal dataset for Emotion and Cognitive understanding in Older adults. MECO includes 42 participants and provides approximately 38 hours of multimodal signals, yielding 30,592 synchronized samples. To maximize ecological validity, data collection followed standardized protocols within community-based settings. The modalities cover video, audio, electroencephalography (EEG), and electrocardiography (ECG). In addition, the dataset offers comprehensive annotations of emotional and cognitive states, including self-assessed valence, arousal, six basic emotions, and Mini-Mental State Examination cognitive scores. We further establish baseline benchmarks for both emotion and cognitive prediction. MECO serves as a foundational resource for multimodal modeling of affect and cognition in aging populations, facilitating downstream applications such as personalized emotion recognition and early detection of mild cognitive impairment (MCI) in real-world settings. The complete dataset and supplementary materials are available at https://maitrechen.github.io/meco-page/.
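To make the described structure concrete, the sketch below models one synchronized MECO sample as a plain Python record: four modalities (video, audio, EEG, ECG) with joint emotion and cognitive labels. This is purely illustrative — field names, signal shapes, rating scales, and the MMSE screening cutoff are assumptions, not the dataset's official schema or API.

```python
# Hypothetical sketch of one MECO sample (NOT the official loader/API).
# Field names, shapes, and scales are illustrative assumptions based on
# the modalities and annotations listed in the abstract.
from dataclasses import dataclass
from typing import List

# The six basic emotion categories mentioned in the annotations.
BASIC_EMOTIONS = ["happiness", "sadness", "anger", "fear", "disgust", "surprise"]


@dataclass
class MecoSample:
    participant_id: int      # one of the 42 participants
    video_path: str          # synchronized video clip (assumed file layout)
    audio_path: str          # synchronized audio clip (assumed file layout)
    eeg: List[List[float]]   # EEG, channels x time (placeholder shape)
    ecg: List[float]         # ECG signal (placeholder shape)
    valence: float           # self-assessed dimensional label (scale assumed)
    arousal: float           # self-assessed dimensional label (scale assumed)
    emotion: str             # categorical label, one of the six basic emotions
    mmse: int                # Mini-Mental State Examination score (0-30)

    def below_mmse_cutoff(self, threshold: int = 24) -> bool:
        # A commonly used MMSE screening cutoff; context-dependent, and
        # not prescribed by the dataset itself.
        return self.mmse < threshold


sample = MecoSample(
    participant_id=7,
    video_path="clips/p07_trial03.mp4",   # hypothetical path
    audio_path="clips/p07_trial03.wav",   # hypothetical path
    eeg=[[0.0] * 256],
    ecg=[0.0] * 256,
    valence=6.5,
    arousal=3.0,
    emotion="happiness",
    mmse=27,
)
assert sample.emotion in BASIC_EMOTIONS
assert not sample.below_mmse_cutoff()
```

A record-level view like this also makes the two benchmark tasks explicit: emotion prediction targets `valence`, `arousal`, and `emotion`, while cognitive prediction targets `mmse`.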