Enhancing Knowledge Graph Completion with GNN Distillation and Probabilistic Interaction Modeling

πŸ“… 2025-05-18
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Knowledge graphs (KGs) suffer from pervasive link incompleteness, hindering downstream applications. Existing knowledge graph completion (KGC) methods face two key bottlenecks: deep graph neural networks (GNNs) are prone to over-smoothing, while embedding-based models struggle to capture abstract, high-order relational patterns. To address these challenges, we propose APIM-Distill, a unified framework that jointly integrates GNN distillation with Abstract Probabilistic Interaction Modeling (APIM). Specifically, we design an iterative message-feature filtering mechanism to mitigate over-smoothing, and introduce an APIM module grounded in probabilistic signatures and transition matrices to enable interpretable, structured modeling of high-order relations. Extensive experiments on WN18RR and FB15K-237 demonstrate that APIM-Distill achieves state-of-the-art performance, improving Mean Reciprocal Rank (MRR) by up to 4.2% over prior methods. The source code is publicly available.

πŸ“ Abstract
Knowledge graphs (KGs) serve as fundamental structures for organizing interconnected data across diverse domains. However, most KGs remain incomplete, limiting their effectiveness in downstream applications. Knowledge graph completion (KGC) aims to address this issue by inferring missing links, but existing methods face critical challenges: deep graph neural networks (GNNs) suffer from over-smoothing, while embedding-based models fail to capture abstract relational features. This study overcomes these limitations with a unified framework that integrates GNN distillation and abstract probabilistic interaction modeling (APIM). The GNN distillation approach introduces an iterative message-feature filtering process to mitigate over-smoothing, preserving the discriminative power of node representations. The APIM module complements this by learning structured, abstract interaction patterns through probabilistic signatures and transition matrices, allowing for a richer, more flexible representation of entity and relation interactions. We apply GNN distillation to GNN-based models and APIM to embedding-based KGC models, conducting extensive evaluations on the widely used WN18RR and FB15K-237 datasets. Our results demonstrate significant performance gains over baseline models, showcasing the effectiveness of the proposed techniques. The findings highlight the importance of both controlling information propagation and leveraging structured probabilistic modeling, offering new avenues for advancing knowledge graph completion. Our code is available at https://anonymous.4open.science/r/APIM_and_GNN-Distillation-461C.
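The paper's exact filtering rule is not reproduced on this page, but the idea behind iterative message-feature filtering as a defense against over-smoothing can be illustrated with a minimal sketch: at each propagation step, only the residual component of the aggregated neighbor message (the part not already contained in the node's own feature) is blended in with a damping factor, so node identities erode more slowly. The function name `message_feature_filter` and the parameters `layers` and `beta` are illustrative assumptions, not the authors' API.

```python
import numpy as np

def message_feature_filter(H, A_norm, layers=4, beta=0.7):
    """Hypothetical sketch of iterative message-feature filtering.

    H      : (n, d) node feature matrix
    A_norm : (n, n) row-normalized adjacency matrix
    At each step, the aggregated neighbor message is filtered against
    the node's current feature: only the residual (what the neighbors
    would add) is blended in, damped by beta, which limits the drift of
    all embeddings toward a common mean (over-smoothing).
    """
    for _ in range(layers):
        M = A_norm @ H                 # neighbor aggregation
        R = M - H                      # residual: new information only
        H = H + (1.0 - beta) * R       # damped update keeps node identity
        H = H / np.linalg.norm(H, axis=1, keepdims=True)  # renormalize rows
    return H
```

With plain mean aggregation and enough layers, all rows of `H` converge toward the same vector; damping the residual update slows that collapse, which is exactly the effect the distillation step is meant to control.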
Problem

Research questions and friction points this paper is trying to address.

Overcoming over-smoothing in deep GNNs for KGC
Capturing abstract relational features in embedding-based KGC
Integrating GNN distillation and probabilistic modeling for KGC
Innovation

Methods, ideas, or system contributions that make the work stand out.

GNN distillation mitigates over-smoothing via iterative message-feature filtering
Probabilistic interaction modeling captures abstract relational patterns
Unified framework combines GNN distillation and probabilistic modeling
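To make the probabilistic interaction modeling above concrete, here is a hedged sketch of one plausible scoring rule; the paper's actual parameterization may differ. Each entity carries a probabilistic signature, a distribution over K abstract interaction channels, and each relation carries a row-stochastic K x K transition matrix; a triple (h, r, t) is scored by the probability mass flowing from the head's channels through the relation to the tail's channels. The name `apim_score` and all shapes below are illustrative assumptions.

```python
import numpy as np

def apim_score(sig_h, T_r, sig_t):
    """Score a triple under a hypothetical APIM-style parameterization.

    sig_h, sig_t : (K,) probabilistic signatures of head and tail
                   entities (non-negative, summing to 1)
    T_r          : (K, K) row-stochastic transition matrix of relation r
    Returns the mass that flows from the head's interaction channels,
    through the relation's transitions, onto the tail's channels.
    """
    return float(sig_h @ T_r @ sig_t)
```

A relation that swaps two channels, for instance, gives a perfect score to a head concentrated on channel 0 paired with a tail concentrated on channel 1; because every factor is a distribution or stochastic matrix, the score stays in [0, 1] and can be read as an interaction probability.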
πŸ”Ž Similar Papers
No similar papers found.
Lingzhi Wang
Associate Professor, Harbin Institute of Technology, Shenzhen
Artificial IntelligenceInformation SecurityNLPSocial Media Analysis
Pengcheng Huang
Computer Engineering Group, ETH Zurich
Intelligent Learning SystemsCyber Physical Systems
Haotian Li
Shandong Key Laboratory of Industrial Network Security, School of Computer Science and Technology, Harbin Institute of Technology, Weihai 264209
Yuliang Wei
Shandong Key Laboratory of Industrial Network Security, School of Computer Science and Technology, Harbin Institute of Technology, Weihai 264209
Guodong Xin
Shandong Key Laboratory of Industrial Network Security, School of Computer Science and Technology, Harbin Institute of Technology, Weihai 264209
Rui Zhang
Shandong Key Laboratory of Industrial Network Security, School of Computer Science and Technology, Harbin Institute of Technology, Weihai 264209
Donglin Zhang
Shandong Key Laboratory of Industrial Network Security, School of Computer Science and Technology, Harbin Institute of Technology, Weihai 264209
Zhenzhou Ji
School of Computer Science and Technology, Harbin Institute of Technology, Weihai 264209
Wei Wang
Shandong Key Laboratory of Industrial Network Security, School of Computer Science and Technology, Harbin Institute of Technology, Weihai 264209