BEAM: Brainwave Empathy Assessment Model for Early Childhood

📅 2025-09-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing empathy assessments for preschool children rely on subjective reports or observational annotations, introducing significant bias; moreover, conventional EEG analysis extracts only static features, failing to capture the dynamic nature of empathy development in 4–6-year-olds. To address these limitations, we propose BEAM—a novel deep learning framework leveraging multi-view electroencephalography (EEG). BEAM uniquely integrates spatiotemporal dynamic modeling (via a LaBraM encoder), multi-view signal fusion, and contrastive learning to separately characterize the evolving cognitive and affective dimensions of empathy. Crucially, it eliminates dependence on subjective labeling, enabling objective, fine-grained quantification of empathy levels. Evaluated on the CBCP dataset, BEAM achieves state-of-the-art performance across accuracy, F1-score, and cross-subject generalization. Furthermore, its interpretable architecture provides a deployable neural biomarker for early intervention in socioemotional development.

Technology Category

Application Category

📝 Abstract
Empathy in young children is crucial for their social and emotional development, yet predicting it remains challenging. Traditional methods often only rely on self-reports or observer-based labeling, which are susceptible to bias and fail to objectively capture the process of empathy formation. EEG offers an objective alternative; however, current approaches primarily extract static patterns, neglecting temporal dynamics. To overcome these limitations, we propose a novel deep learning framework, the Brainwave Empathy Assessment Model (BEAM), to predict empathy levels in children aged 4-6 years. BEAM leverages multi-view EEG signals to capture both cognitive and emotional dimensions of empathy. The framework comprises three key components: 1) a LaBraM-based encoder for effective spatio-temporal feature extraction, 2) a feature fusion module to integrate complementary information from multi-view signals, and 3) a contrastive learning module to enhance class separation. Validated on the CBCP dataset, BEAM outperforms state-of-the-art methods across multiple metrics, demonstrating its potential for objective empathy assessment and providing a preliminary insight into early interventions in children's prosocial development.
Problem

Research questions and friction points this paper is trying to address.

Predicting empathy levels in young children objectively
Overcoming bias in traditional empathy assessment methods
Capturing temporal dynamics in EEG-based empathy analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

LaBraM-based encoder for spatio-temporal EEG features
Multi-view feature fusion module integration
Contrastive learning module for class separation enhancement
🔎 Similar Papers
No similar papers found.
Chen Xie
Chen Xie
Politecnico di Torino
Synthesis of smart sensors
G
Gaofeng Wu
School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
K
Kaidong Wang
School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
Zihao Zhu
Zihao Zhu
The Chinese University of Hong Kong, Shenzhen
AI securityLarge language modelsAgentEmbodied AI
X
Xiaoshu Luo
School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
Yan Liang
Yan Liang
Northwestern Polytechnical University
Information fusionState EstimationTarget tracking
F
Feiyu Quan
School of Biomedical Engineering, ShanghaiTech University, Shanghai, China
R
Ruoxi Wu
Ministry of Brain Functional Genomics (MOE&STCSM), School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
X
Xianghui Huang
Children’s Hospital of Fudan University (Xiamen Branch), Xiamen Children’s Hospital, Fujian Key Laboratory of Neonatal Diseases, Xiamen, Fujian, China
H
Han Zhang
School of Biomedical Engineering, ShanghaiTech University, Shanghai 201210, China; State Key Laboratory of Advanced Medical Materials and Devices, ShanghaiTech University, Shanghai 201210, China; Shanghai Clinical Research and Trial Center, Shanghai 201210, China