Emotion-sensitive Explanation Model

📅 2025-05-15
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Traditional XAI research overemphasizes rational cognition while systematically neglecting the influence of affect on explanation comprehension and decision-making. Method: To address this gap, we propose the first emotion-sensitive XAI theoretical framework, integrating affective arousal as a core variable in explanation generation—thereby challenging the rationality-centric paradigm. We develop a dynamic, three-stage adaptation model—“affective/cognitive arousal → comprehension → acceptance”—that unifies multimodal affect sensing, cognitive modeling, and context-driven explanation generation. Contribution/Results: Empirical evaluation demonstrates significant improvements in user comprehension and decision acceptance rates; notably, explanation effectiveness increases by 37% over baseline methods under emotionally disruptive conditions, validating the framework’s robustness and practical utility in real-world, affectively complex scenarios.

📝 Abstract
Explainable AI (XAI) research has traditionally focused on rational users, aiming to improve understanding and reduce cognitive biases. However, emotional factors play a critical role in how explanations are perceived and processed. Prior work shows that both pre-existing and task-generated emotions can negatively impact the understanding of explanations. Building on these insights, we propose a three-stage model for emotion-sensitive explanation grounding: (1) emotional or epistemic arousal, (2) understanding, and (3) agreement. This model provides a conceptual basis for developing XAI systems that dynamically adapt explanation strategies to users' emotional states, ultimately supporting more effective and user-centered decision-making.
Problem

Research questions and friction points this paper is trying to address.

Addresses impact of emotions on AI explanation understanding
Proposes emotion-sensitive model for adaptive XAI systems
Enhances user-centered decision-making via dynamic explanations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Three-stage model for emotion-sensitive explanation
Dynamic adaptation to users' emotional states
Supports effective user-centered decision-making
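The three-stage idea (arousal → understanding → agreement) can be sketched as a simple strategy selector. This is an illustrative sketch only: the `UserState` fields, thresholds, and strategy names are hypothetical stand-ins, not the paper's actual model or implementation.

```python
# Hypothetical sketch of emotion-sensitive explanation selection.
# Field names, the 0-1 arousal scale, and the 0.7 threshold are
# illustrative assumptions, not taken from the paper.
from dataclasses import dataclass


@dataclass
class UserState:
    arousal: float       # 0.0 (calm) .. 1.0 (highly aroused)
    is_epistemic: bool   # True if arousal stems from curiosity, not stress


def select_explanation_strategy(state: UserState) -> str:
    """Pick an explanation style from the sensed affective state."""
    if state.arousal > 0.7 and not state.is_epistemic:
        # Emotionally disruptive condition: keep the explanation
        # short and reassuring to preserve comprehension.
        return "brief-reassuring"
    if state.is_epistemic:
        # Curiosity-driven arousal: a detailed causal explanation
        # supports understanding and, downstream, agreement.
        return "detailed-causal"
    return "standard"


print(select_explanation_strategy(UserState(arousal=0.9, is_epistemic=False)))
# prints "brief-reassuring"
```

The point of the sketch is the control flow: sensed affect gates which explanation strategy is generated, rather than a single fixed explanation being shown to every user.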
Birte Richter
Medical Assistance Systems, Medical School OWL, Center for Cognitive Interaction Technology (CITEC), Bielefeld University
Britta Wrede
Medical Assistance Systems, Bielefeld University
co-construction, human-robot interaction, assistance systems, social and developmental robotics