IXAII: An Interactive Explainable Artificial Intelligence Interface for Decision Support Systems

📅 2025-06-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing explainable AI (XAI) methods are predominantly static and overlook user diversity and interactivity, which limits their explanatory effectiveness. To address this, the authors propose IXAII, an interactive XAI system that combines multiple explanation techniques, role-aware adaptation, and real-time interactivity. IXAII incorporates four local and counterfactual explanation methods (LIME, SHAP, Anchors, and DiCE) and supports on-demand customization of explanation content and visualization format for five distinct user roles. Through the combined design of user modeling, multi-view visualizations, and an interactive frontend, IXAII bridges the gap between algorithmic interpretability, human cognitive constraints, and practical deployment needs. Interview-based evaluations with domain experts and lay users indicate that IXAII's multi-explanation orchestration and format controllability are perceived as helpful for increasing decision transparency, trust in the model, and user understanding.

📝 Abstract
Although several post-hoc methods for explainable AI have been developed, most are static and neglect the user perspective, limiting their effectiveness for the target audience. In response, we developed the interactive explainable intelligent system called IXAII that offers explanations from four explainable AI methods: LIME, SHAP, Anchors, and DiCE. Our prototype provides tailored views for five user groups and gives users agency over the explanations' content and their format. We evaluated IXAII through interviews with experts and lay users. Our results indicate that IXAII, which provides different explanations with multiple visualization options, is perceived as helpful to increase transparency. By bridging the gaps between explainable AI methods, interactivity, and practical implementation, we provide a novel perspective on AI explanation practices and human-AI interaction.
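The abstract's core interaction idea, tailored views per user group plus user agency over explanation content and format, can be illustrated with a minimal sketch. The role names, method routing, and format options below are illustrative assumptions, not the authors' actual implementation; only the four method names (LIME, SHAP, Anchors, DiCE) and the count of five user groups come from the paper.

```python
# Hedged sketch of a role-aware explanation dispatcher in the spirit of IXAII.
# ROLE_PROFILES is a hypothetical mapping; the paper does not publish its role
# definitions, so these five roles and defaults are assumptions.
ROLE_PROFILES = {
    "data_scientist": {"methods": ["SHAP", "LIME", "Anchors", "DiCE"], "format": "plot"},
    "domain_expert":  {"methods": ["Anchors", "DiCE"], "format": "rules"},
    "decision_maker": {"methods": ["SHAP"], "format": "summary"},
    "regulator":      {"methods": ["Anchors"], "format": "rules"},
    "affected_user":  {"methods": ["DiCE"], "format": "text"},
}

def select_explanations(role, override_methods=None, override_format=None):
    """Return the explanation methods and format for a role.

    Users can override both fields, modeling the paper's idea of giving
    users agency over the explanations' content and their format.
    """
    profile = ROLE_PROFILES[role]
    return {
        "methods": override_methods or profile["methods"],
        "format": override_format or profile["format"],
    }

# A lay user who defaults to counterfactual (DiCE) text explanations
# switches the presentation to a plot view:
view = select_explanations("affected_user", override_format="plot")
```

In a real system each method name would dispatch to the corresponding library (e.g. `shap`, `lime`, `dice-ml`) and the format key to a rendering component; the dict-based profile is just the simplest way to express the role-to-view mapping the abstract describes.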
Problem

Research questions and friction points this paper is trying to address.

Develops interactive explainable AI for decision support
Addresses static methods lacking user perspective
Provides tailored explanations for diverse user groups
Innovation

Methods, ideas, or system contributions that make the work stand out.

Interactive explainable AI system IXAII
Combines LIME, SHAP, Anchors, DiCE methods
Tailored views for diverse user groups