NeSTR: A Neuro-Symbolic Abductive Framework for Temporal Reasoning in Large Language Models

📅 2025-12-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language models (LLMs) frequently exhibit temporal misinterpretation and logical inconsistency in complex, time-constrained sequential reasoning. To address this, we propose NeSTR, a neuro-symbolic abductive framework featuring three synergistic mechanisms: (1) explicit symbolic encoding of temporal relations among events; (2) automated consistency verification grounded in first-order temporal logic; and (3) multi-step abductive reflection to detect and rectify erroneous inferences. NeSTR requires no model fine-tuning, enhancing temporal sensitivity solely through neuro-symbolic collaboration at inference time. Evaluated on zero-shot temporal question-answering benchmarks, including TimeQA and TempReason, NeSTR achieves substantial improvements in accuracy (+12.7%) and logical consistency (+18.3%). It establishes the first paradigm for temporal reasoning that is symbolically interpretable, neurally scalable, and zero-shot applicable.
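The paper does not ship code, but mechanisms (1) and (2) can be illustrated with a minimal sketch: temporal facts are encoded as symbolic BEFORE relations, and consistency is verified by checking that those relations admit a valid ordering (a cycle such as "A before B, B before A" is a contradiction). The event names and the `verify_consistency` helper below are hypothetical, not from the paper.

```python
# Hypothetical sketch of symbolic temporal encoding (mechanism 1) and
# consistency verification (mechanism 2). Each fact (a, b) asserts that
# event a ends before event b begins. BEFORE is transitive, so the facts
# are consistent iff they form a DAG.

def verify_consistency(before_facts):
    """Return True if the BEFORE facts admit a consistent event ordering."""
    graph = {}
    for a, b in before_facts:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set())

    # Kahn-style topological sort: if every event can be ordered, the
    # facts are acyclic and therefore temporally consistent.
    indegree = {v: 0 for v in graph}
    for a in graph:
        for b in graph[a]:
            indegree[b] += 1
    queue = [v for v, d in indegree.items() if d == 0]
    ordered = 0
    while queue:
        v = queue.pop()
        ordered += 1
        for w in graph[v]:
            indegree[w] -= 1
            if indegree[w] == 0:
                queue.append(w)
    return ordered == len(graph)

facts = [("elected", "inaugurated"), ("inaugurated", "resigned")]
print(verify_consistency(facts))                               # True
print(verify_consistency(facts + [("resigned", "elected")]))   # False
```

In the full framework the LLM would extract these relations from text and the symbolic checker would flag contradictions for the reflection stage; this sketch covers only the verification step.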

📝 Abstract
Large Language Models (LLMs) have demonstrated remarkable performance across a wide range of natural language processing tasks. However, temporal reasoning, particularly under complex temporal constraints, remains a major challenge. To this end, existing approaches have explored symbolic methods, which encode temporal structure explicitly, and reflective mechanisms, which revise reasoning errors through multi-step inference. Nonetheless, symbolic approaches often underutilize the reasoning capabilities of LLMs, while reflective methods typically lack structured temporal representations, which can result in inconsistent or hallucinated reasoning. As a result, even when the correct temporal context is available, LLMs may still misinterpret or misapply time-related information, leading to incomplete or inaccurate answers. To address these limitations, in this work, we propose Neuro-Symbolic Temporal Reasoning (NeSTR), a novel framework that integrates structured symbolic representations with hybrid reflective reasoning to enhance the temporal sensitivity of LLM inference. NeSTR preserves explicit temporal relations through symbolic encoding, enforces logical consistency via verification, and corrects flawed inferences using abductive reflection. Extensive experiments on diverse temporal question answering benchmarks demonstrate that NeSTR achieves superior zero-shot performance and consistently improves temporal reasoning without any fine-tuning, showcasing the advantage of neuro-symbolic integration in enhancing temporal understanding in large language models.
Problem

Research questions and friction points this paper is trying to address.

LLMs misinterpret or misapply time-related information under complex temporal constraints, even when the correct context is available
Symbolic methods underutilize LLM reasoning capabilities, while reflective methods lack structured temporal representations
Flawed temporal inferences need to be detected and corrected without model fine-tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates structured symbolic representations with hybrid reflective reasoning
Preserves explicit temporal relations via symbolic encoding
Enforces logical consistency through automated verification
Corrects flawed inferences using abductive reflection
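The abductive reflection mechanism can be sketched as a repair loop: when verification flags a contradiction, propose minimal revisions of the asserted facts and keep the first revision that restores consistency. Both the toy `verify` checker and the `abductive_repair` helper below are illustrative assumptions, not the paper's actual procedure.

```python
# Hedged sketch of abductive reflection (mechanism 3): on detecting a
# contradiction, drop one asserted fact at a time and re-verify, keeping
# the minimal revision that restores consistency.

def verify(facts):
    # Toy checker: inconsistent iff both (a, b) and (b, a) are asserted.
    pairs = set(facts)
    return all((b, a) not in pairs for a, b in pairs)

def abductive_repair(facts):
    """Return a consistent subset of facts, dropping as little as possible."""
    if verify(facts):
        return facts
    for i in range(len(facts)):          # try removing one fact at a time
        candidate = facts[:i] + facts[i + 1:]
        if verify(candidate):
            return candidate
    return []  # no single-fact removal restores consistency

facts = [("elected", "inaugurated"), ("inaugurated", "elected")]
print(abductive_repair(facts))  # one of the two conflicting facts is dropped
```

In NeSTR the revision candidates would come from re-querying the LLM rather than blind deletion; the loop above only shows the verify-then-revise control flow.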