Finding Uncommon Ground: A Human-Centered Model for Extrospective Explanations

📅 2025-07-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing AI explanation methods focus primarily on internal model mechanics and fail to meet the comprehension needs of non-expert users. Method: The paper proposes a human-centered, outward-facing ("extrospective") explanation framework whose core contribution is modeling the agent's worldview as a personalized memory. By modeling the user's cognitive state and contextual factors from interaction history, the framework dynamically identifies knowledge gaps and generates novel, contextually relevant explanations. Technically, it combines context-aware reasoning, cognitive-boundary estimation, and a dynamic memory mechanism to filter knowledge in real time and generate customized explanations. Contribution/Results: Experiments demonstrate that the system significantly improves non-expert users' understanding of and trust in AI decisions, and it is the first approach to continually learn individual users' cognitive boundaries and deliver genuinely personalized, adaptive explanations.

📝 Abstract
The need for explanations in AI has, by and large, been driven by the desire to increase the transparency of black-box machine learning models. However, such explanations, which focus on the internal mechanisms that lead to a specific output, are often unsuitable for non-experts. To facilitate a human-centered perspective on AI explanations, agents need to focus on individuals and their preferences as well as the context in which the explanations are given. This paper proposes a personalized approach to explanation, where the agent tailors the information provided to the user based on what is most likely pertinent to them. We propose a model of the agent's worldview that also serves as a personal and dynamic memory of its previous interactions with the same user, based on which the artificial agent can estimate what part of its knowledge is most likely new information to the user.
Problem

Research questions and friction points this paper is trying to address.

Addressing non-expert needs in AI explanations
Personalizing explanations based on user context
Building dynamic memory for tailored user interactions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Human-centered personalized explanation approach
Dynamic memory for user interaction history
Worldview model to estimate relevant information
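The worldview-as-memory idea above can be sketched in a few lines: the agent keeps a per-user record of what it has already shared and, when explaining a new decision, surfaces only the facts it estimates are new to that user. This is a minimal illustrative sketch, not the authors' implementation; all class and function names (`UserWorldview`, `explain`) and the set-membership novelty test are assumptions.

```python
# Hypothetical sketch of a per-user worldview memory for personalized
# explanations: record delivered facts, estimate which candidate facts
# are novel to this user, and explain only those.
from dataclasses import dataclass, field


@dataclass
class UserWorldview:
    """Dynamic memory of one user's interaction history (illustrative)."""
    user_id: str
    known_facts: set[str] = field(default_factory=set)

    def record(self, facts: list[str]) -> None:
        # Update the memory after an explanation is delivered.
        self.known_facts.update(facts)

    def novel(self, candidate_facts: list[str]) -> list[str]:
        # Estimate which candidates lie outside the user's current
        # cognitive boundary (here crudely: simple set membership).
        return [f for f in candidate_facts if f not in self.known_facts]


def explain(worldview: UserWorldview, candidate_facts: list[str]) -> list[str]:
    """Return only the likely-new facts, then update the memory."""
    selection = worldview.novel(candidate_facts)
    worldview.record(selection)
    return selection


wv = UserWorldview("alice")
print(explain(wv, ["sensor was occluded", "model fell back to lidar"]))
print(explain(wv, ["sensor was occluded", "route was replanned"]))
```

On first contact both facts are delivered; on the second call only the previously unseen fact is, which is the filtering behavior the summary describes. A real system would replace the set-membership test with a learned estimate of the user's knowledge.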