Beyond Textual Context: Structural Graph Encoding with Adaptive Space Alignment to alleviate the hallucination of LLMs

πŸ“… 2025-09-26
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Existing approaches integrating knowledge graphs (KGs) with large language models (LLMs) rely solely on semantic information while neglecting KG structural topology, and suffer from embedding space misalignment between KG encoders and LLMsβ€”leading to frequent hallucinations. To address this, we propose a structure-semantic co-enhancement architecture that jointly models KG topology and semantics for the first time. Our method employs graph neural networks for structural encoding and introduces an adaptive cross-modal alignment module to dynamically calibrate KG and LLM embedding spaces. It comprises three core components: KG retrieval (KGR), structure-aware KG encoding (KGE), and space-adaptive KG adaptation (KGA). Extensive experiments on multiple factual reasoning benchmarks demonstrate significant improvements in answer accuracy and robustness, with hallucination rates reduced by 23.6% on average. These results validate the effectiveness of explicit structural modeling and embedding-space alignment in KG-LLM integration.

Technology Category

Application Category

πŸ“ Abstract
Currently, the main approach for Large Language Models (LLMs) to tackle the hallucination issue is incorporating Knowledge Graphs(KGs).However, LLMs typically treat KGs as plain text, extracting only semantic information and limiting their use of the crucial structural aspects of KGs. Another challenge is the gap between the embedding spaces of KGs encoders and LLMs text embeddings, which hinders the effective integration of structured knowledge. To overcome these obstacles, we put forward the SSKG-LLM, an innovative model architecture that is designed to efficiently integrate both the Structural and Semantic information of KGs into the reasoning processes of LLMs. SSKG-LLM incorporates the Knowledge Graph Retrieval (KGR) module and the Knowledge Graph Encoding (KGE) module to preserve semantics while utilizing structure. Then, the Knowledge Graph Adaptation (KGA) module is incorporated to enable LLMs to understand KGs embeddings. We conduct extensive experiments and provide a detailed analysis to explore how incorporating the structural information of KGs can enhance the factual reasoning abilities of LLMs. Our code are available at https://github.com/yfangZhang/SSKG-LLM.
Problem

Research questions and friction points this paper is trying to address.

Integrating structural graph information into LLMs reasoning
Aligning embedding spaces between knowledge graphs and LLMs
Reducing LLM hallucinations through enhanced factual reasoning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates structural and semantic knowledge graph information
Aligns embedding spaces between graphs and language models
Uses specialized modules for retrieval, encoding, and adaptation
πŸ”Ž Similar Papers
No similar papers found.
Y
Yifang Zhang
Wuhan University of Technology
P
Pengfei Duan
Wuhan University of Technology
Y
Yiwen Yang
Wuhan University of Technology
Shengwu Xiong
Shengwu Xiong
Wuhan University of Technology
Artificial Intelligence