Redundancy-Aware Test-Time Graph Out-of-Distribution Detection

📅 2025-10-16
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
In graph classification, structural redundancy induces semantic shift, severely degrading out-of-distribution (OOD) detection performance. To address this, we propose RedOUT, the first unsupervised, redundancy-aware graph OOD detection framework. Its core is the Redundancy-aware Graph Information Bottleneck (ReGIB), which theoretically models and minimizes redundant information via structural entropy, deriving tight optimization bounds and achieving a principled balance between redundancy disentanglement and semantic preservation at test time. RedOUT integrates structural entropy analysis, the information bottleneck principle, and graph neural network representation learning, augmented by a test-time adaptive optimization mechanism. Evaluated on multiple real-world benchmarks, RedOUT consistently outperforms state-of-the-art methods by an average of 6.7%, with gains up to 17.3% on the ClinTox/LIPO dataset pair, significantly enhancing robustness in graph OOD detection.
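The summary describes unsupervised, label-free scoring of test graphs. The paper's own score is built on ReGIB, but as a rough illustration of what distance-based test-time OOD scoring over graph embeddings looks like, here is a generic k-nearest-neighbor rule; the function name, the k-NN rule, and the synthetic embeddings are all illustrative assumptions, not RedOUT's method:

```python
import numpy as np

def knn_ood_score(train_embs: np.ndarray, test_emb: np.ndarray, k: int = 5) -> float:
    """Distance to the k-th nearest in-distribution embedding.

    Larger scores suggest the test graph is out-of-distribution.
    Generic unsupervised baseline, NOT RedOUT's scoring rule.
    """
    dists = np.linalg.norm(train_embs - test_emb, axis=1)
    return float(np.sort(dists)[k - 1])

rng = np.random.default_rng(0)
in_dist = rng.normal(0.0, 0.1, size=(100, 16))   # embeddings of training graphs
near = np.zeros(16)                              # test graph near the training cluster
far = np.full(16, 3.0)                           # test graph far from it
print(knn_ood_score(in_dist, near) < knn_ood_score(in_dist, far))  # True
```

In this style of detector, a threshold on the score separates in-distribution from OOD graphs; RedOUT's contribution is making the embeddings themselves redundancy-free before any such scoring.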

πŸ“ Abstract
Distributional discrepancy between training and test data can lead models to make inaccurate predictions when encountering out-of-distribution (OOD) samples in real-world applications. Although existing graph OOD detection methods leverage data-centric techniques to extract effective representations, their performance remains compromised by structural redundancy that induces semantic shifts. To address this dilemma, we propose RedOUT, an unsupervised framework that integrates structural entropy into test-time OOD detection for graph classification. Concretely, we introduce the Redundancy-aware Graph Information Bottleneck (ReGIB) and decompose the objective into essential information and irrelevant redundancy. By minimizing structural entropy, the decoupled redundancy is reduced, and theoretically grounded upper and lower bounds are proposed for optimization. Extensive experiments on real-world datasets demonstrate the superior performance of RedOUT on OOD detection. Specifically, our method achieves an average improvement of 6.7%, significantly surpassing the best competitor by 17.3% on the ClinTox/LIPO dataset pair.
Problem

Research questions and friction points this paper is trying to address.

Detects graph out-of-distribution samples at test time
Reduces structural redundancy causing semantic shifts in graphs
Improves OOD detection via redundancy-aware information bottleneck optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates structural entropy for graph OOD detection
Decomposes objective into essential information and redundancy
Minimizes structural entropy to reduce irrelevant redundancy
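The bullets above hinge on structural entropy as the quantity being minimized. The paper works with encoding-tree structural entropy; as a minimal sketch, the classic one-dimensional structural entropy (the degree-distribution entropy that higher-dimensional, tree-based variants generalize) can be computed as:

```python
import math
from collections import defaultdict

def one_dim_structural_entropy(edges):
    """One-dimensional structural entropy of an undirected graph:
    H1(G) = -sum_i (d_i / 2m) * log2(d_i / 2m),
    where d_i is the degree of node i and m is the edge count.
    Sketch of the base definition only, not the paper's ReGIB objective.
    """
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    two_m = 2 * len(edges)
    return -sum((d / two_m) * math.log2(d / two_m) for d in degree.values())

# A 4-cycle: all degrees equal, so entropy = log2(4) = 2.0 bits
print(one_dim_structural_entropy([(0, 1), (1, 2), (2, 3), (3, 0)]))  # 2.0
```

Intuitively, lower structural entropy means a more compressible, less "noisy" structure, which is why minimizing it serves as a proxy for stripping redundant structural information.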
Yue Hou
State Key Laboratory of Complex & Critical Software Environment, Beihang University
He Zhu
State Key Laboratory of Complex & Critical Software Environment, Beihang University
Ruomei Liu
State Key Laboratory of Complex & Critical Software Environment, Beihang University
Yingke Su
Shen Yuan Honors College, Beihang University
Junran Wu
National University of Singapore
Graph Neural Networks, Natural Language Processing, Time Series
Ke Xu
State Key Laboratory of Complex & Critical Software Environment, Beihang University