🤖 AI Summary
This work addresses the limitations of existing explanation methods in Answer Set Programming (ASP), which are often confined to specific scenarios and fail to accommodate diverse user needs. For the first time, it systematically organizes ASP explanation approaches within the framework of Explainable Artificial Intelligence (XAI), establishing a user-centered taxonomy by mapping types of user queries to the explanatory capabilities of current theoretical tools. The study clarifies gaps and shortcomings in the coverage of existing methods and lays a theoretical foundation for developing more general, user-oriented ASP explanation systems. Furthermore, it outlines a roadmap for future research aimed at enhancing the adaptability and comprehensiveness of explanations in ASP.
📝 Abstract
Answer Set Programming (ASP) is a popular declarative reasoning and problem-solving approach in symbolic AI. Its rule-based formalism makes it inherently attractive for explainable and interpretable reasoning, which is gaining importance with the surge of Explainable AI (XAI). A number of explanation approaches and tools for ASP have been developed, which often tackle specific explanatory settings and may not cover all scenarios that ASP users encounter. In this survey, we provide, guided by an XAI perspective, an overview of types of ASP explanations in connection with user questions for explanation, and describe their coverage by current theory and tools. Furthermore, we pinpoint gaps in existing ASP explanation approaches and identify research directions for future work.