CQD-SHAP: Explainable Complex Query Answering via Shapley Values

📅 2025-10-17
🤖 AI Summary
Problem: Existing complex query answering (CQA) methods on incomplete knowledge graphs, particularly neural-symbolic models such as CQD, lack interpretability: they cannot quantify the contribution of individual query components to the ranking of an answer. Method: The paper proposes CQD-SHAP, the first Shapley-value-based neural-symbolic framework for explainable CQA, integrating CQD's reasoning process with an axiomatically grounded attribution mechanism from cooperative game theory. Results: The framework quantifies subquery contributions across multi-hop reasoning paths in a theoretically justified way. Automated evaluation of explanation necessity and sufficiency, compared against multiple baselines, shows the approach is effective for most query types while improving model transparency and user trust.

📝 Abstract
Complex query answering (CQA) goes beyond the well-studied link prediction task by addressing more sophisticated queries that require multi-hop reasoning over incomplete knowledge graphs (KGs). Research on neural and neurosymbolic CQA methods is still an emerging field. Almost all of these methods can be regarded as black-box models, which may raise concerns about user trust. Although neurosymbolic approaches like CQD are slightly more interpretable, allowing intermediate results to be tracked, the importance of different parts of the query remains unexplained. In this paper, we propose CQD-SHAP, a novel framework that computes the contribution of each query part to the ranking of a specific answer. This contribution explains the value of leveraging a neural predictor that can infer new knowledge from an incomplete KG, rather than a symbolic approach relying solely on existing facts in the KG. CQD-SHAP is formulated based on Shapley values from cooperative game theory and satisfies all the fundamental Shapley axioms. Automated evaluation of these explanations in terms of necessary and sufficient explanations, and comparisons with various baselines, shows the effectiveness of this approach for most query types.
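The Shapley-value attribution described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual code: the query parts, the coalition value function `v` (here a toy reciprocal-rank gain for using the neural predictor on each hop of a 2-hop query), and all numbers are hypothetical.

```python
from itertools import combinations
from math import factorial

def shapley_values(parts, v):
    """Exact Shapley value of each query part under value function v.

    v maps a coalition (set of parts using the neural predictor, with the
    rest assumed to fall back to symbolic lookup) to a ranking score.
    """
    n = len(parts)
    phi = {p: 0.0 for p in parts}
    for p in parts:
        others = [q for q in parts if q != p]
        for k in range(n):
            for S in combinations(others, k):
                # Standard Shapley weight for a coalition of size k.
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                # Marginal contribution of p to coalition S.
                phi[p] += weight * (v(frozenset(S) | {p}) - v(frozenset(S)))
    return phi

# Toy value function: score gain from using the neural predictor on
# hops of a 2-hop query (illustrative numbers only).
gains = {frozenset(): 0.0,
         frozenset({"hop1"}): 0.2,
         frozenset({"hop2"}): 0.5,
         frozenset({"hop1", "hop2"}): 1.0}

phi = shapley_values(["hop1", "hop2"], lambda S: gains[frozenset(S)])
```

By construction the values satisfy the efficiency axiom: `phi["hop1"] + phi["hop2"]` equals the score gap between the full and empty coalitions, so each hop's share of the answer's ranking improvement is fully accounted for.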
Problem

Research questions and friction points this paper is trying to address.

Explaining contributions of query parts in complex reasoning
Addressing black-box nature of neural knowledge graph models
Quantifying importance of neural inference over symbolic facts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Shapley values for query part contributions
Explains neural predictions on incomplete knowledge graphs
Satisfies fundamental axioms from cooperative game theory