GS-KGC: A generative subgraph-based framework for knowledge graph completion with large language models

📅 2024-08-20
🏛️ Information Fusion
📈 Citations: 0
Influential: 0
🤖 AI Summary
Problem: Knowledge graph completion (KGC) must achieve high-accuracy link prediction from local subgraph information alone, without relying on global graph structure or extensive labeled data.
Method: The paper proposes a generative subgraph modeling framework that introduces, for the first time, a subgraph-level prompt generation mechanism, integrating structure-aware encoding with semantic generation from large language models (LLMs) without fine-tuning. The approach supports zero-shot and few-shot KGC.
Contributions/Results: Key innovations include structure-aware subgraph sampling and encoding, joint structure-text representation learning, and generative triple decoding. On standard benchmarks (FB15k-237, WN18RR), the method significantly outperforms conventional embedding-based models and LLM fine-tuning approaches, achieves a 3× inference speedup, and generalizes well to unseen relations and sparse entities.
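To make the pipeline concrete, here is a minimal sketch of the two steps the summary describes: sampling a local subgraph around a query entity, then verbalizing it into a prompt for an LLM to complete a missing triple. This is an illustrative toy, not the paper's implementation; the triples, hop count, and prompt wording are invented for the example.

```python
from collections import defaultdict

# Toy knowledge graph as (head, relation, tail) triples (hypothetical data).
TRIPLES = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Paris", "located_in", "France"),
    ("Berlin", "capital_of", "Germany"),
]

def sample_subgraph(triples, entity, hops=1):
    """Collect all triples within `hops` edges of `entity` via simple BFS."""
    adj = defaultdict(list)
    for h, r, t in triples:
        adj[h].append((h, r, t))
        adj[t].append((h, r, t))
    frontier, seen, subgraph = {entity}, {entity}, []
    for _ in range(hops):
        next_frontier = set()
        for e in frontier:
            for h, r, t in adj[e]:
                if (h, r, t) not in subgraph:
                    subgraph.append((h, r, t))
                for n in (h, t):
                    if n not in seen:
                        seen.add(n)
                        next_frontier.add(n)
        frontier = next_frontier
    return subgraph

def build_prompt(subgraph, head, relation):
    """Verbalize the sampled subgraph as context and pose the tail query."""
    context = "\n".join(f"({h}, {r}, {t})" for h, r, t in subgraph)
    return ("Known facts:\n" + context +
            f"\nComplete the triple: ({head}, {relation}, ?)")

sub = sample_subgraph(TRIPLES, "Paris", hops=1)
prompt = build_prompt(sub, "Paris", "capital_of")
```

The resulting prompt string would be sent to a frozen LLM; the paper's actual encoding of subgraph structure is richer than this plain-text verbalization.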

Technology Category: Large Language Models
Application Category: Knowledge Graph Completion

Problem
Research questions and friction points this paper is trying to address.
- Limited Data Utilization

Innovation
Methods, ideas, or system contributions that make the work stand out.
- Generative Subgraph Strategy
- Question Answering Approach
- Large Language Models Integration
Authors
Rui Yang — Zhongshan School of Medicine, Sun Yat-sen University
Jiahao Zhu — Sun Yat-sen University
Jianping Man — Zhongshan School of Medicine, Sun Yat-sen University
Hongze Liu — Zhongshan School of Medicine, Sun Yat-sen University
Li Fang — Zhongshan School of Medicine, Sun Yat-sen University
Yi Zhou — Zhongshan School of Medicine, Sun Yat-sen University