Comprehending Knowledge Graphs with Large Language Models for Recommender Systems

📅 2024-10-16
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing knowledge graphs (KGs) for recommendation face three key challenges: factual incompleteness, semantic loss due to ID-based representation, and difficulty in modeling high-order global relational structures. To address these, we propose CoLaKG, a dual-granularity KG-understanding framework driven by a large language model (LLaMA-2). It employs subgraph centralization for precise local modeling and integrates retrieval-augmented generation (RAG) with global semantic retrieval to capture long-range dependencies. We design a representation fusion and retrieval-augmented learning module that jointly models KG entity IDs and natural-language semantics, a first of its kind, and incorporate contrastive learning to optimize representations. Extensive experiments on four real-world datasets demonstrate state-of-the-art performance, achieving up to an 18.7% improvement in Recall@20 over prior methods. The results validate the effectiveness of semantic completion and high-order global relational modeling in KG-enhanced recommendation.
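The "subgraph centralization" step described above (called item-centered subgraph extraction in the abstract) can be sketched as follows. This is an illustrative toy sketch, not the paper's code: the KG representation, function names, and prompt wording are all hypothetical.

```python
# Hypothetical sketch of item-centered subgraph extraction followed by
# prompt construction for an LLM. The KG is a list of (head, relation,
# tail) triples; all identifiers below are invented for illustration.

def extract_subgraph(kg, item, hops=1):
    """Collect triples reachable within `hops` steps from the item."""
    frontier, triples = {item}, []
    for _ in range(hops):
        next_frontier = set()
        for h, r, t in kg:
            if h in frontier:
                triples.append((h, r, t))
                next_frontier.add(t)
        frontier = next_frontier
    return triples

def build_prompt(item, triples):
    """Serialize the local subgraph into text for the LLM to reason over."""
    facts = "; ".join(f"{h} {r} {t}" for h, r, t in triples)
    return (f"Item: {item}. Known facts: {facts}. "
            "Summarize this item and infer any missing attributes.")

kg = [("Inception", "directed_by", "Christopher Nolan"),
      ("Inception", "genre", "Sci-Fi"),
      ("Christopher Nolan", "directed", "Interstellar")]
prompt = build_prompt("Inception", extract_subgraph(kg, "Inception", hops=1))
```

The LLM's response to such a prompt would then supply the missing facts and the natural-language item description that the downstream fusion module consumes.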

📝 Abstract
In recent years, the introduction of knowledge graphs (KGs) has significantly advanced recommender systems by facilitating the discovery of potential associations between items. However, existing methods still face several limitations. First, most KGs suffer from missing facts or limited scopes. Second, existing methods convert textual information in KGs into IDs, resulting in the loss of natural semantic connections between different items. Third, existing methods struggle to capture high-order connections in the global KG. To address these limitations, we propose a novel method called CoLaKG, which leverages large language models (LLMs) to improve KG-based recommendations. The extensive world knowledge and remarkable reasoning capabilities of LLMs enable our method to supplement missing facts in KGs. Additionally, their powerful text understanding abilities allow for better utilization of semantic information. Specifically, CoLaKG extracts useful information from the KG at both local and global levels. By employing item-centered subgraph extraction and prompt engineering, it accurately captures the local KG. Subsequently, through retrieval-based neighbor enhancement, it supplements the current item by capturing related items from the entire KG, thereby effectively utilizing global information. The local and global information extracted by the LLM is effectively integrated into the recommendation model through a representation fusion module and a retrieval-augmented representation learning module, respectively, thereby improving recommendation performance. Extensive experiments on four real-world datasets demonstrate the superiority of our method.
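The retrieval-based neighbor enhancement described in the abstract can be illustrated with a minimal sketch: each item's LLM-derived description is assumed to be embedded as a vector, the top-k most similar items are retrieved from the whole KG, and their embeddings are mixed into the target item's representation. This is not the paper's implementation; the function names, the mixing weight `alpha`, and the toy vectors are assumptions for illustration.

```python
# Hypothetical sketch of retrieval-based neighbor enhancement over
# LLM-derived item embeddings. All names and parameters are illustrative.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve_neighbors(target, embeddings, k=2):
    """Return the k items most similar to `target` (excluding itself)."""
    scores = [(cosine(embeddings[target], v), name)
              for name, v in embeddings.items() if name != target]
    return [name for _, name in sorted(scores, reverse=True)[:k]]

def enhance(target, embeddings, k=2, alpha=0.5):
    """Mix the target embedding with the mean of its retrieved neighbors."""
    neighbors = retrieve_neighbors(target, embeddings, k)
    dim = len(embeddings[target])
    mean = [sum(embeddings[n][i] for n in neighbors) / len(neighbors)
            for i in range(dim)]
    return [alpha * t + (1 - alpha) * m
            for t, m in zip(embeddings[target], mean)]

embs = {"A": [1.0, 0.0], "B": [0.9, 0.1], "C": [0.0, 1.0]}
fused = enhance("A", embs, k=1)
```

In practice such retrieval would run over an approximate-nearest-neighbor index rather than an exhaustive scan, and the fused representation would feed the retrieval-augmented representation learning module rather than being used directly.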
Problem

Research questions and friction points this paper is trying to address.

Enhance KG-based recommender systems
Address missing facts in KGs
Capture high-order global KG connections
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages large language models
Enhances knowledge graph completeness
Integrates local and global information