AI Summary
Existing approaches integrating knowledge graphs (KGs) with large language models (LLMs) rely solely on semantic information while neglecting KG structural topology, and suffer from embedding-space misalignment between KG encoders and LLMs, leading to frequent hallucinations. To address this, we propose a structure-semantic co-enhancement architecture that jointly models KG topology and semantics for the first time. Our method employs graph neural networks for structural encoding and introduces an adaptive cross-modal alignment module to dynamically calibrate KG and LLM embedding spaces. It comprises three core components: KG retrieval (KGR), structure-aware KG encoding (KGE), and space-adaptive KG adaptation (KGA). Extensive experiments on multiple factual reasoning benchmarks demonstrate significant improvements in answer accuracy and robustness, with hallucination rates reduced by 23.6% on average. These results validate the effectiveness of explicit structural modeling and embedding-space alignment in KG-LLM integration.
Abstract
Currently, the main approach for Large Language Models (LLMs) to tackle the hallucination issue is incorporating Knowledge Graphs (KGs). However, LLMs typically treat KGs as plain text, extracting only semantic information and making little use of the crucial structural properties of KGs. Another challenge is the gap between the embedding spaces of KG encoders and LLM text embeddings, which hinders the effective integration of structured knowledge. To overcome these obstacles, we propose SSKG-LLM, a model architecture designed to efficiently integrate both the Structural and Semantic information of KGs into the reasoning processes of LLMs. SSKG-LLM incorporates the Knowledge Graph Retrieval (KGR) module and the Knowledge Graph Encoding (KGE) module to preserve semantics while utilizing structure. Then, the Knowledge Graph Adaptation (KGA) module is incorporated to enable LLMs to understand KG embeddings. We conduct extensive experiments and provide a detailed analysis to explore how incorporating the structural information of KGs can enhance the factual reasoning abilities of LLMs. Our code is available at https://github.com/yfangZhang/SSKG-LLM.
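To make the three-stage pipeline concrete, the following is a minimal toy sketch of the KGR → KGE → KGA flow described above. All function names, vector dimensions, and the arithmetic are illustrative assumptions for exposition only; they are not the authors' implementation (which uses graph neural network encoders and a learned alignment module).

```python
# Toy sketch of the SSKG-LLM pipeline: retrieve triples (KGR),
# encode them with one structure-aware step (KGE), then project
# into a pretend "LLM" embedding space (KGA). All illustrative.

def kg_retrieval(question, kg):
    """KGR: keep triples whose head or tail entity appears in the question."""
    return [(h, r, t) for (h, r, t) in kg if h in question or t in question]

def kg_encoding(triples, dim=4):
    """KGE: toy structure-aware encoding. Each entity gets a deterministic
    base vector; one message-passing-like step adds the neighbor's vector,
    so the result reflects graph topology, not just entity strings."""
    def base(e):
        s = sum(ord(c) for c in e)
        return [((s * (i + 1)) % 97) / 97.0 for i in range(dim)]
    emb = {}
    for h, _, t in triples:
        emb.setdefault(h, base(h))
        emb.setdefault(t, base(t))
    out = {}
    for h, _, t in triples:  # one neighbor-aggregation step per triple
        out[h] = [a + b for a, b in zip(emb[h], emb[t])]
        out[t] = [a + b for a, b in zip(emb[t], emb[h])]
    ents = list(out.values()) or [[0.0] * dim]
    # mean-pool entity vectors into one subgraph embedding
    return [sum(v) / len(ents) for v in zip(*ents)]

def kg_adaptation(kg_vec, llm_dim=8):
    """KGA: map the KG embedding into the (here, pretend) LLM embedding
    space so the two spaces are dimensionally aligned."""
    return [kg_vec[i % len(kg_vec)] for i in range(llm_dim)]

kg = [("Paris", "capital_of", "France"),
      ("Berlin", "capital_of", "Germany")]
triples = kg_retrieval("What is the capital of France?", kg)
aligned = kg_adaptation(kg_encoding(triples))
```

In the real architecture, `kg_adaptation` would be a trained projection that calibrates the KG encoder's space to the LLM's token-embedding space; the simple index-cycling here only stands in for that dimensional alignment.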