Beyond Technocratic XAI: The Who, What & How in Explanation Design

📅 2025-08-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current XAI practice faces a core challenge: generating meaningful explanations is inherently context-dependent, yet most technical approaches neglect users' backgrounds, needs, and preferred delivery modalities, undermining accessibility, transparency, and ethical accountability. This paper moves beyond the technology-centric paradigm by integrating design thinking into XAI, proposing a context-aware explanation design framework organized around three questions: for whom, what to explain, and how to convey it. Drawing on human-computer interaction principles, sociotechnical systems theory, and contextual analysis, the framework treats cognitive fairness, social equity, and accountable governance as core design dimensions rather than afterthoughts. It offers practitioners an actionable guide for developing XAI systems whose explanations are more comprehensible, fair, and responsible. Ultimately, it advances XAI's evolution from a technical artifact toward a responsible sociotechnical practice.


📝 Abstract
The field of Explainable AI (XAI) offers a wide range of techniques for making complex models interpretable. Yet, in practice, generating meaningful explanations is a context-dependent task that requires intentional design choices to ensure accessibility and transparency. This paper reframes explanation as a situated design process, an approach particularly relevant for practitioners involved in building and deploying explainable systems. Drawing on prior research and principles from design thinking, we propose a three-part framework for explanation design in XAI: asking Who needs the explanation, What they need explained, and How that explanation should be delivered. We also emphasize the need for ethical considerations, including risks of epistemic inequality, reinforcing social inequities, and obscuring accountability and governance. By treating explanation as a sociotechnical design process, this framework encourages a context-aware approach to XAI that supports effective communication and the development of ethically responsible explanations.
Problem

Research questions and friction points this paper is trying to address.

Generating meaningful, context-dependent explanations in XAI
Structuring explanation design around Who, What, and How
Incorporating ethical considerations to avoid reinforcing social inequities
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reframing XAI explanation as a situated design process
Three-part design framework: Who, What, How
Ethical considerations embedded in explanation design