Uncertainty-Aware Graph Neural Networks: A Multi-Hop Evidence Fusion Approach

📅 2025-06-16
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Existing graph neural networks (GNNs) overlook the uncertainty in class probabilities induced by varying model depth, leading to unreliable predictions. To address this, we propose Evidential Fusion GNN (EFGNN), the first GNN framework integrating Dempster–Shafer evidence theory into a multi-hop architecture to explicitly model prediction uncertainty across different receptive fields. We design a parameter-free cumulative belief fusion mechanism that adaptively aggregates evidential outputs from multiple layers. Furthermore, we introduce a joint optimization objective comprising evidential cross-entropy, a conflict regularization term, and a spurious-confidence penalty. Extensive experiments on benchmark datasets demonstrate that EFGNN significantly improves classification accuracy while enhancing predictive reliability and robustness. Notably, it explicitly identifies high-risk misclassifications. The implementation is publicly available.
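Conceptually, the evidential outputs mentioned above follow the standard subjective-logic parameterization: non-negative per-class evidence defines a Dirichlet distribution, from which belief masses and an explicit uncertainty mass are derived. The sketch below is illustrative only (function names are our own, not from the paper's released code):

```python
import numpy as np

def evidence_to_opinion(evidence):
    """Convert non-negative evidence for K classes into a subjective-logic
    opinion: per-class belief masses plus an overall uncertainty mass.
    Uses the common Dirichlet parameterization alpha = evidence + 1."""
    evidence = np.asarray(evidence, dtype=float)
    K = evidence.size
    alpha = evidence + 1.0            # Dirichlet parameters
    S = alpha.sum()                   # Dirichlet strength
    belief = evidence / S             # b_k = e_k / S
    uncertainty = K / S               # u = K / S, so sum(b) + u = 1
    return belief, uncertainty

# Example: strong evidence for class 0, little for the rest
b, u = evidence_to_opinion([4.0, 1.0, 0.0])
```

By construction the beliefs and the uncertainty sum to one, so a node with little total evidence (e.g. a sparsely connected node) is reported as uncertain rather than forced into a confident class probability.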

๐Ÿ“ Abstract
Graph neural networks (GNNs) excel in graph representation learning by integrating graph structure and node features. Existing GNNs, unfortunately, fail to account for the uncertainty of class probabilities that varies with model depth, leading to unreliable and risky predictions in real-world scenarios. To bridge this gap, in this paper we propose a novel Evidence Fusing Graph Neural Network (EFGNN for short) to achieve trustworthy prediction, enhance node classification accuracy, and make the risk of wrong predictions explicit. In particular, we integrate evidence theory with a multi-hop propagation-based GNN architecture to quantify the prediction uncertainty of each node while accounting for multiple receptive fields. Moreover, a parameter-free cumulative belief fusion (CBF) mechanism is developed to leverage the changes in prediction uncertainty and fuse the evidence to improve the trustworthiness of the final prediction. To effectively optimize the EFGNN model, we carefully design a joint learning objective composed of an evidence cross-entropy, a dissonance coefficient, and a false-confidence penalty. The experimental results on various datasets and theoretical analyses demonstrate the effectiveness of the proposed model in terms of accuracy and trustworthiness, as well as its robustness to potential attacks. The source code of EFGNN is available at https://github.com/Shiy-Li/EFGNN.
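The evidence cross-entropy term of the joint objective has a standard closed form under a Dirichlet: the expected cross-entropy of a one-hot label is ψ(S) − ψ(α_k) for the true class k, where ψ is the digamma function. A minimal sketch, assuming the usual alpha = evidence + 1 parameterization (the dissonance and false-confidence terms would be added with weighting coefficients, which we omit here):

```python
import math

def _digamma(x, h=1e-5):
    # Illustrative numerical digamma via a central difference of log-gamma;
    # a real implementation would use scipy.special.digamma.
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def evidential_ce(evidence, label):
    """Expected cross-entropy of a one-hot label under the Dirichlet
    induced by the evidence: psi(S) - psi(alpha_label)."""
    alpha = [e + 1.0 for e in evidence]
    S = sum(alpha)
    return _digamma(S) - _digamma(alpha[label])

# Loss is small when evidence concentrates on the true class
loss = evidential_ce([4.0, 1.0, 0.0], label=0)
```

Unlike a plain softmax cross-entropy, this loss decreases both when the correct class gathers evidence and when total evidence grows, which is what lets training shape the uncertainty estimates rather than only the class ranking.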
Problem

Research questions and friction points this paper is trying to address.

Quantify prediction uncertainty in graph neural networks
Improve node classification accuracy and trustworthiness
Fuse multi-hop evidence for reliable predictions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Evidence Fusing GNN for trustworthy predictions
Multi-hop propagation with uncertainty quantification
Parameter-free cumulative belief fusion mechanism
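The parameter-free fusion bullet above can be sketched with Jøsang's cumulative fusion rule for subjective-logic opinions, which combines two (belief, uncertainty) pairs without any learned weights: the less uncertain opinion dominates, and the fused uncertainty shrinks as evidence accumulates. Whether EFGNN uses exactly this rule is an assumption here; the sketch is illustrative:

```python
import numpy as np

def cumulative_fusion(b1, u1, b2, u2):
    """Parameter-free cumulative fusion of two subjective-logic opinions
    (Josang's rule): beliefs are weighted by the other opinion's
    uncertainty, so more certain sources contribute more."""
    denom = u1 + u2 - u1 * u2
    b = (b1 * u2 + b2 * u1) / denom
    u = (u1 * u2) / denom
    return b, u

# Fuse hypothetical opinions from 1-hop and 2-hop receptive fields
b1, u1 = np.array([0.6, 0.2]), 0.2
b2, u2 = np.array([0.5, 0.1]), 0.4
bf, uf = cumulative_fusion(b1, u1, b2, u2)
```

Note that the fused uncertainty is lower than either input's, which matches the intuition that agreeing evidence from multiple hops should make the final prediction more trustworthy.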