GNNs' Uncertainty Quantification using Self-Distillation

📅 2025-06-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the inefficiency and imprecision of uncertainty quantification for Graph Neural Networks (GNNs) in clinical decision-making, this paper proposes an efficient, high-precision uncertainty estimation method based on self-distillation. A single GNN serves simultaneously as both teacher and student, eliminating the need to train multiple independent models. In addition, a weighted diversity-aware uncertainty metric explicitly models disagreement among the network's internal classifiers, addressing the limited diversity characterization of the standard ensemble disagreement metric. Evaluations on the MIMIC-IV and Enzymes graph benchmarks show calibration comparable to Monte Carlo (MC) Dropout and ensemble methods at substantially lower computational cost, and the method accurately identifies out-of-distribution samples, enhancing reliability and trustworthiness in clinical deployment.
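The paper's exact metric is not reproduced on this page; the following is a minimal sketch of one plausible form of a weighted disagreement-based uncertainty score over a network's internal classifier heads. The function name, the choice of KL divergence from the weighted consensus, and the default uniform weights are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def weighted_disagreement_uncertainty(probs, weights=None):
    """Uncertainty as weighted disagreement among classifier heads.

    probs:   shape (num_heads, num_classes); softmax output of each
             internal GNN classifier for a single sample.
    weights: optional per-head weights (normalized internally);
             defaults to uniform weighting.
    """
    probs = np.asarray(probs, dtype=float)
    k = probs.shape[0]
    if weights is None:
        weights = np.full(k, 1.0 / k)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()

    # Weighted consensus prediction across heads.
    mean_p = weights @ probs  # shape (num_classes,)

    # Disagreement: weighted average KL divergence of each head's
    # prediction from the consensus. Zero iff all (weighted) heads agree.
    eps = 1e-12
    kl = np.sum(probs * (np.log(probs + eps) - np.log(mean_p + eps)), axis=1)
    return float(weights @ kl)
```

Heads that agree yield a score near zero, while heads that contradict each other push the score up; the per-head weights let a more trusted (e.g. deeper) classifier dominate the consensus, which is the intuition behind weighting diverse classifiers differently.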

📝 Abstract
Graph Neural Networks (GNNs) have shown remarkable performance in the healthcare domain. However, quantifying the predictive uncertainty of GNNs, an important aspect of trustworthiness in clinical settings, remains challenging. While Bayesian and ensemble methods can be used to quantify uncertainty, they are computationally expensive. Additionally, the disagreement metric used by ensemble methods to compute uncertainty cannot capture the diversity of models in an ensemble network. In this paper, we propose a novel method, based on knowledge distillation, to quantify GNNs' uncertainty more efficiently and with higher precision. We apply self-distillation, where the same network serves as both the teacher and student models, thereby avoiding the need to train several networks independently. To ensure the impact of self-distillation, we develop an uncertainty metric that captures the diverse nature of the network by assigning different weights to each GNN classifier. We experimentally evaluate the precision, performance, and ability of our approach to distinguish out-of-distribution data on two graph datasets: MIMIC-IV and Enzymes. The evaluation results demonstrate that the proposed method can effectively capture the predictive uncertainty of the model while achieving performance similar to that of the MC Dropout and ensemble methods. The code is publicly available at https://github.com/tailabTMU/UQ_GNN.
Problem

Research questions and friction points this paper is trying to address.

Quantify GNN predictive uncertainty efficiently
Improve uncertainty precision in clinical settings
Reduce computational cost of uncertainty methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-distillation for GNN uncertainty quantification
Weighted uncertainty metric for diverse classifiers
Calibration comparable to MC Dropout at lower computational cost
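The self-distillation idea above, where an internal (student) classifier of the same network is trained against its deepest (teacher) head, is commonly implemented as a cross-entropy term mixed with a temperature-softened KL term. The sketch below illustrates that standard loss shape; the function name, the mixing weight `alpha`, and the temperature `T` are hypothetical defaults, not values from the paper.

```python
import numpy as np

def self_distillation_loss(student_logits, teacher_logits, label,
                           alpha=0.5, T=2.0):
    """Per-sample loss for one internal (student) head distilled from the
    deepest (teacher) head of the same network: hard-label cross-entropy
    mixed with temperature-softened KL toward the teacher's prediction."""
    def softmax(z):
        z = z - z.max()  # shift for numerical stability
        e = np.exp(z)
        return e / e.sum()

    eps = 1e-12
    # Hard-label cross-entropy on the student's own prediction.
    ce = -np.log(softmax(student_logits)[label] + eps)

    # KL(teacher || student) at temperature T, scaled by T^2 as is
    # conventional so gradients stay comparable across temperatures.
    p_t = softmax(np.asarray(teacher_logits, dtype=float) / T)
    p_s = softmax(np.asarray(student_logits, dtype=float) / T)
    kl = np.sum(p_t * (np.log(p_t + eps) - np.log(p_s + eps))) * T * T

    return (1 - alpha) * ce + alpha * kl
```

Because teacher and student live in one network, a forward pass yields all heads at once, which is the efficiency gain over training an ensemble of separate GNNs.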