GISExplainer: On Explainability of Graph Neural Networks via Game-theoretic Interaction Subgraphs

📅 2024-09-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the insufficient interpretability, unclear node-edge collaboration mechanisms, and low explanation fidelity of Graph Neural Networks (GNNs) in critical domains such as healthcare, finance, and cybersecurity, this paper proposes a causal subgraph explanation method grounded in game-theoretic interaction. The approach models explanation as a causally driven, sequential subgraph-growth process, ensuring both structural connectivity and human comprehensibility. Key contributions include: (1) the first multi-granularity coalition-interaction causal attribution framework, explicitly capturing positive and negative collaborative effects between nodes and edges; (2) integration of Shapley interaction values with multi-scale coalition sampling and computational efficiency optimizations; and (3) state-of-the-art performance on the Fidelity and Sparsity metrics. Experimental results show significant improvements over existing methods, confirming that modeling causal node-edge interdependencies is both effective and necessary for faithful GNN explanation.
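The game-theoretic quantity behind this attribution is the Shapley interaction index, which measures whether two players (here, edges of the computation graph) contribute more to the prediction together than separately. Below is a minimal Monte Carlo sketch of that index with coalition sampling; `value_fn` is a hypothetical stand-in for the GNN's prediction score on an edge coalition, and the whole snippet is an illustration of the underlying quantity, not the authors' implementation.

```python
import random

def shapley_interaction(i, j, players, value_fn, num_samples=200, seed=0):
    """Monte Carlo estimate of the Shapley interaction index I(i, j).

    players  : list of all players (e.g. candidate edges of the computation graph)
    value_fn : maps a frozenset of players to a scalar payoff, e.g. the GNN's
               prediction score on the subgraph induced by that coalition
               (hypothetical stand-in, not the paper's implementation)
    """
    rng = random.Random(seed)
    others = [p for p in players if p != i and p != j]
    total = 0.0
    for _ in range(num_samples):
        # Sample a coalition S of uniformly random size from players \ {i, j};
        # this reproduces the |S|!(n-|S|-2)!/(n-1)! weighting of the exact index.
        rng.shuffle(others)
        k = rng.randint(0, len(others))
        S = frozenset(others[:k])
        # Discrete second difference: how much adding i changes the payoff
        # once j is also present (positive = synergy, negative = conflict).
        total += (value_fn(S | {i, j}) - value_fn(S | {i})
                  - value_fn(S | {j}) + value_fn(S))
    return total / num_samples
```

Edges whose interactions with the already selected coalition are strongly positive or strongly negative are both informative here, which matches the paper's point that negative coalition effects also matter for interpretation.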

📝 Abstract
Explainability is crucial for the application of black-box Graph Neural Networks (GNNs) in critical fields such as healthcare, finance, and cybersecurity. Various feature attribution methods, especially perturbation-based methods, have been proposed to indicate how much each node/edge contributes to the model predictions. However, these methods fail to generate connected explanatory subgraphs that account for the causal interaction between edges within different coalition scales, which results in unfaithful explanations. In our study, we propose GISExplainer, a novel game-theoretic interaction based explanation method that uncovers what the underlying GNNs have learned for node classification by discovering human-interpretable causal explanatory subgraphs. First, GISExplainer defines a causal attribution mechanism that considers the game-theoretic interaction of multi-granularity coalitions in a candidate explanatory subgraph to quantify the causal effect of an edge on the prediction. Second, GISExplainer assumes that coalitions with negative effects on the predictions are also significant for model interpretation, and that the contribution of the computation graph stems from the combined influence of both positive and negative interactions within the coalitions. Then, GISExplainer regards the explanation task as a sequential decision process, in which salient edges are successively selected and connected to the previously selected subgraph based on their causal effects to form an explanatory subgraph, ultimately striving for better explanations. Additionally, an efficiency optimization scheme is proposed for the causal attribution mechanism through coalition sampling. Extensive experiments demonstrate that GISExplainer achieves better performance than state-of-the-art approaches w.r.t. two quantitative metrics: Fidelity and Sparsity.
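As a rough illustration of the sequential decision process and the two reported metrics, the sketch below greedily grows a connected edge subgraph and then scores it with Fidelity+ and Sparsity. `attribution_fn` and `model_prob` are hypothetical callables (an edge-level causal attribution score and the model's probability for the originally predicted class on an edge set); the paper's actual attribution is the game-theoretic mechanism described above, not this placeholder.

```python
def grow_explanation(candidate_edges, attribution_fn, budget):
    """Greedily grow a connected explanatory subgraph, one salient edge at a time."""
    selected, nodes = [], set()
    remaining = set(candidate_edges)
    while remaining and len(selected) < budget:
        # Connectivity constraint: after the first edge, only edges touching
        # the current subgraph are candidates.
        frontier = [e for e in remaining
                    if not selected or e[0] in nodes or e[1] in nodes]
        if not frontier:
            break
        best = max(frontier, key=lambda e: attribution_fn(e, selected))
        selected.append(best)
        nodes.update(best)
        remaining.discard(best)
    return selected


def fidelity_and_sparsity(model_prob, graph_edges, explanation_edges):
    """Fidelity+ (prediction drop when the explanation is removed) and Sparsity."""
    full, expl = set(graph_edges), set(explanation_edges)
    fidelity = model_prob(full) - model_prob(full - expl)
    sparsity = 1.0 - len(expl) / max(len(full), 1)
    return fidelity, sparsity
```

Under these definitions, a faithful explanation yields high Fidelity (removing it changes the prediction substantially) while keeping Sparsity high (few edges selected), which is the trade-off reported in the experiments.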
Problem

Research questions and friction points this paper is trying to address.

Graph Neural Networks
Explainability
Prediction Accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

GISExplainer
Game Theory
Graph Neural Networks Explanation
Authors
Xingping Xian
School of Cyber Security and Information Law, Chongqing University of Posts and Telecommunications, Chongqing, China
Jianlu Liu
School of Cyber Security and Information Law, Chongqing University of Posts and Telecommunications, Chongqing, China
Chao Wang
School of Computer and Information Science, Chongqing Normal University, Chongqing, China
Tao Wu
School of Cyber Security and Information Law, Chongqing University of Posts and Telecommunications, Chongqing, China
Shaojie Qiao
School of Software Engineering, Chengdu University of Information Technology, Chengdu, China
Xiaochuan Tang
Chengdu University of Technology, Chengdu, China
Qun Liu
School of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing, China