ROG: Retrieval-Augmented LLM Reasoning for Complex First-Order Queries over Knowledge Graphs

📅 2026-02-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Answering complex first-order logical queries involving projection, intersection, union, and negation over incomplete knowledge graphs is highly challenging. This work proposes ROG, a framework that integrates query-aware neighborhood retrieval with chain-of-thought reasoning from large language models. ROG recursively decomposes multi-operator queries into single-operator subqueries and grounds each reasoning step in compact, relevant neighborhood evidence. By combining this retrieval-augmented, stepwise reasoning mechanism with caching and reuse of intermediate answer sets, ROG significantly improves reasoning consistency and robustness, particularly for high-complexity and negation-containing queries. Experimental results show that ROG consistently outperforms strong embedding-based baselines across standard benchmarks.

📝 Abstract
Answering first-order logic (FOL) queries over incomplete knowledge graphs (KGs) is difficult, especially for complex query structures that compose projection, intersection, union, and negation. We propose ROG, a retrieval-augmented framework that combines query-aware neighborhood retrieval with large language model (LLM) chain-of-thought reasoning. ROG decomposes a multi-operator query into a sequence of single-operator sub-queries and grounds each step in compact, query-relevant neighborhood evidence. Intermediate answer sets are cached and reused across steps, improving consistency on deep reasoning chains. This design reduces compounding errors and yields more robust inference on complex and negation-heavy queries. Overall, ROG provides a practical alternative to embedding-based logical reasoning by replacing learned operators with retrieval-grounded, step-wise inference. Experiments on standard KG reasoning benchmarks show consistent gains over strong embedding-based baselines, with the largest improvements on high-complexity and negation-heavy query types.
Problem

Research questions and friction points this paper is trying to address.

knowledge graphs
first-order logic queries
complex query reasoning
incomplete KGs
negation
Innovation

Methods, ideas, or system contributions that make the work stand out.

retrieval-augmented reasoning
large language model
knowledge graph querying
first-order logic
chain-of-thought reasoning
Ziyan Zhang
School of Information Science and Engineering, Chongqing Jiaotong University
Chao Wang
State Grid Chongqing Electric Power Company
Zhuo Chen
State Grid Chongqing Electric Power Company
Chiyi Li
State Grid Chongqing Electric Power Company
Kai Song
TikTok Inc.
NLP & LLM