Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning

📅 2024-05-02
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Answering complex first-order logic (FOL) queries over incomplete knowledge graphs (KGs) remains challenging, as existing embedding-based methods struggle to explicitly model logical structure and cannot draw on shared world knowledge. Method: The authors propose LACT, a Logic-Aware Curriculum Tuning framework that introduces a logic-aware curriculum learning mechanism. LACT combines binary-tree decomposition of FOL queries, a quantitative assessment of logical complexity, and instruction fine-tuning of large language models (LLMs). Training proceeds in stages ordered from structurally simple to complex queries, addressing the difficulty gap among query types that hampers conventional one-shot fine-tuning. Contribution/Results: By explicitly encoding logical structure in the training data and grounding reasoning in a pretrained LLM, LACT improves mean reciprocal rank (MRR) by an average of 5.5% over prior methods on widely used complex query answering benchmarks, establishing new state-of-the-art results.

📝 Abstract
Answering complex queries over incomplete knowledge graphs (KGs) is a challenging task. Most previous works have focused on learning entity/relation embeddings and simulating first-order logic operators with various neural networks. However, they are bottlenecked by the inability to share world knowledge to improve logical reasoning, resulting in suboptimal performance. In this paper, we propose a complex reasoning schema over KGs built upon large language models (LLMs), containing a curriculum-based logic-aware instruction tuning framework, named LACT. Specifically, we augment arbitrary first-order logic queries via binary tree decomposition to stimulate the reasoning capability of LLMs. To address the difficulty gap among different types of complex queries, we design a simple and flexible logic-aware curriculum learning framework. Experiments across widely used datasets demonstrate that LACT yields substantial improvements (an average +5.5% MRR) over advanced methods, achieving a new state of the art.
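The curriculum idea in the abstract can be sketched in a few lines: score each training query by its logical complexity, sort easy-to-hard, and split the result into training stages. The operator weights, the `QuerySample` class, and `curriculum_stages` below are hypothetical illustrations, not the paper's actual scoring metric or training code.

```python
# Sketch of logic-aware curriculum ordering (assumed, simplified):
# queries with more (and harder) logical operators are trained later.
from dataclasses import dataclass, field

# Illustrative per-operator weights; negation is assumed hardest.
OP_WEIGHTS = {"projection": 1, "intersection": 2, "union": 2, "negation": 3}

@dataclass
class QuerySample:
    text: str                                 # instruction-tuning prompt
    ops: list = field(default_factory=list)   # logical operators in the query

    def complexity(self) -> int:
        # Sum operator weights; unknown operators count as 1.
        return sum(OP_WEIGHTS.get(op, 1) for op in self.ops)

def curriculum_stages(samples, n_stages=3):
    """Sort samples easy-to-hard and split them into training stages."""
    ordered = sorted(samples, key=QuerySample.complexity)
    size = -(-len(ordered) // n_stages)  # ceil division
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

samples = [
    QuerySample("1p query", ["projection"]),
    QuerySample("2i query", ["intersection", "projection", "projection"]),
    QuerySample("pni query", ["projection", "negation", "intersection"]),
]
stages = curriculum_stages(samples, n_stages=3)
print([[s.text for s in stage] for stage in stages])
# → [['1p query'], ['2i query'], ['pni query']]
```

Each stage would then be used for one phase of instruction fine-tuning, so the model sees single-hop projections before queries mixing intersection and negation.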
Problem

Research questions and friction points this paper is trying to address.

Enhances complex query answering over incomplete knowledge graphs
Improves logical reasoning by integrating world knowledge with LLMs
Addresses difficulty gaps in complex queries via curriculum learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses large language models for KG reasoning
Introduces logic-aware curriculum tuning framework
Decomposes logical queries via binary trees
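The binary-tree decomposition mentioned above can be illustrated with a small sketch: represent a FOL query as a binary tree whose leaves are anchor entities and whose internal nodes are relations or logical operators, then walk it post-order so each sub-query is emitted before its parent. The `Node` class, the operator labels, and the output format are assumptions for illustration, not the paper's exact representation.

```python
# Sketch (assumed) of binary-tree decomposition of a FOL query into
# ordered sub-queries that could be verbalised as LLM reasoning steps.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    label: str                      # operator ("AND", a relation) or entity
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def decompose(node):
    """Post-order traversal: emit each sub-tree before its parent."""
    if node is None:
        return []
    steps = decompose(node.left) + decompose(node.right)
    if node.left or node.right:     # internal node = one reasoning step
        parts = [c.label for c in (node.left, node.right) if c]
        steps.append(f"{node.label}({', '.join(parts)})")
    return steps

# A 2i-style query: entities reached from anchor A via relation r1,
# intersected with entities reached from anchor B via relation r2.
tree = Node("AND",
            Node("r1", Node("A")),
            Node("r2", Node("B")))
print(decompose(tree))
# → ['r1(A)', 'r2(B)', 'AND(r1, r2)']
```

The ordered step list makes each intermediate set explicit, which is the kind of structured supervision the framework feeds to the LLM instead of the raw query.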