Linking Facial Recognition of Emotions and Socially Shared Regulation in Medical Simulation

📅 2025-10-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates the synergistic mechanism between emotion recognition—via facial expressions—and socially shared regulation of learning (SSRL) in medical simulation training, focusing on affective–cognitive coupling differences between novices and experts during collaborative virtual diagnostic tasks. Using transmodal analysis (TMA), it synchronously integrates automated facial expression recognition with discourse data to dynamically model associations between emotional states and SSRL processes. Results reveal that experts exhibit tight coupling between high-arousal emotions (e.g., surprise, anger) and strong SSRL, reflecting focused cognitive engagement; novices, by contrast, show loose associations between low-arousal emotions (e.g., happiness, sadness) and weak SSRL, suggesting cognitive overload or attentional fragmentation. This work is the first to identify a hierarchical interaction pattern between emotional arousal and SSRL intensity, providing quantifiable, multimodal empirical evidence to inform the design of collaborative medical learning environments and competency-based progression assessment.

📝 Abstract
Computer-supported simulation provides a practical alternative for medical training. This study investigates the co-occurrence of facial-recognition-derived emotions and socially shared regulation of learning (SSRL) interactions in a medical simulation training context. Using transmodal analysis (TMA), we compare novice and expert learners' affective and cognitive engagement patterns during collaborative virtual diagnosis tasks. Results reveal that expert learners exhibit strong associations between socio-cognitive interactions and high-arousal emotions (surprise, anger), suggesting focused, effortful engagement. In contrast, novice learners demonstrate stronger links between socio-cognitive processes and happiness or sadness, with less coherent SSRL patterns, potentially indicating distraction or cognitive overload. Transmodal analysis of multimodal data (facial expressions and discourse) highlights distinct regulatory strategies between groups, offering methodological and practical insights for computer-supported cooperative work (CSCW) in medical education. Our findings underscore the role of emotion-regulation dynamics in collaborative expertise development and suggest the need for tailored scaffolding to support novice learners' socio-cognitive and affective engagement.
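The core analytic move described above—modeling how often emotion states and SSRL interactions co-occur in synchronized time windows—can be illustrated with a minimal sketch. This is not the authors' implementation; the labels, window pairing, and data below are hypothetical, and a real TMA pipeline would operate on timestamped facial-recognition output and coded discourse rather than a hand-built list.

```python
from collections import Counter
from itertools import product

# Illustrative label sets (assumed, not taken from the paper's coding scheme)
EMOTIONS = ["surprise", "anger", "happiness", "sadness"]
SSRL_CODES = ["strong_SSRL", "weak_SSRL"]

# Each entry pairs a time window's dominant facial emotion with the SSRL
# code assigned to the discourse in that same window (toy data).
windows = [
    ("surprise", "strong_SSRL"),
    ("anger", "strong_SSRL"),
    ("surprise", "strong_SSRL"),
    ("happiness", "weak_SSRL"),
    ("sadness", "weak_SSRL"),
    ("happiness", "weak_SSRL"),
]

def cooccurrence(pairs):
    """Count how often each emotion co-occurs with each SSRL code."""
    counts = Counter(pairs)
    return {(e, s): counts.get((e, s), 0)
            for e, s in product(EMOTIONS, SSRL_CODES)}

table = cooccurrence(windows)
print(table[("surprise", "strong_SSRL")])  # 2 in this toy data
```

Comparing such tables between novice and expert groups (e.g., via normalized frequencies) is one simple way to surface the arousal–SSRL coupling patterns the study reports.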
Problem

Research questions and friction points this paper is trying to address.

Investigating emotion-regulation links in medical simulation training
Comparing novice-expert engagement patterns during collaborative diagnosis
Identifying distinct regulatory strategies through multimodal data analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transmodal analysis compares novice and expert engagement patterns
Facial recognition links emotions with socially shared regulation
Multimodal data reveals distinct regulatory strategies between groups