🤖 AI Summary
To address large language models' (LLMs') heavy reliance on supervision and poor generalization in temporal knowledge graph (TKG) completion when historical facts are sparse, this paper proposes a rule-driven, three-stage collaborative framework: (1) multi-hop retrieval grounded in logical rules to enhance evidence acquisition in low-resource settings; (2) lightweight adapter-based contrastive fine-tuning to improve robustness in temporal semantic modeling; and (3) embedding-similarity-guided semantic filtering at test time to suppress hallucination. Evaluated on four TKG benchmarks, the method achieves up to a 30.6% relative improvement in Hits@10, markedly improving the semantic consistency of predictions and generalization on historically sparse samples. The framework establishes an interpretable and robust paradigm for few-shot temporal reasoning.
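The retrieval stage above can be pictured as expanding outward from the query entity over timestamped facts. The sketch below shows only a plain breadth-first expansion over quadruples; the function name, the quadruple layout `(head, relation, tail, timestamp)`, and the `max_hops` parameter are illustrative assumptions, and the paper's logical-rule guidance for choosing which paths to follow is not modeled here.

```python
def multi_hop_retrieve(facts, start, max_hops=2):
    """Toy multi-hop history retrieval (assumed interface, no rule guidance).

    facts: list of (head, relation, tail, timestamp) quadruples
    start: query entity whose neighborhood we collect
    """
    frontier, seen, retrieved = {start}, {start}, []
    for _ in range(max_hops):
        next_frontier = set()
        for h, r, t, ts in facts:
            # Collect every fact whose head lies on the current frontier.
            if h in frontier and (h, r, t, ts) not in retrieved:
                retrieved.append((h, r, t, ts))
                if t not in seen:
                    next_frontier.add(t)
                    seen.add(t)
        frontier = next_frontier
    return retrieved
```

In a rule-driven variant, the inner loop would expand only along relation paths licensed by mined logical rules, rather than along every outgoing edge.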
📝 Abstract
Temporal Knowledge Graphs (TKGs) represent dynamic facts as timestamped relations between entities. TKG completion involves forecasting missing or future links, requiring models to reason over time-evolving structure. While LLMs show promise for this task, existing approaches often overemphasize supervised fine-tuning and struggle in particular when historical evidence is limited or missing. We introduce RECIPE-TKG, a lightweight and data-efficient framework designed to improve accuracy and generalization in settings with sparse historical context. It combines (1) rule-based multi-hop retrieval for structurally diverse history, (2) contrastive fine-tuning of lightweight adapters to encode relational semantics, and (3) test-time semantic filtering to iteratively refine generations based on embedding similarity. Experiments on four TKG benchmarks show that RECIPE-TKG outperforms previous LLM-based approaches, achieving up to 30.6% relative improvement in Hits@10. Moreover, our proposed framework produces more semantically coherent predictions, even for samples with limited historical context.
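The test-time semantic filtering step (3) can be sketched as a cosine-similarity gate over candidate generations. Everything below is an illustrative assumption, not the paper's implementation: the `embed` callable, the `threshold` value, and the fallback behavior when every candidate is filtered out are all hypothetical.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two dense vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def semantic_filter(candidates, context_emb, embed, threshold=0.5):
    """Toy embedding-similarity filter over LLM-generated candidates.

    candidates:  list of predicted entity strings
    context_emb: embedding of the query/history context
    embed:       callable mapping a string to a vector (assumed interface)
    """
    kept = [c for c in candidates
            if cosine_sim(embed(c), context_emb) >= threshold]
    # Fall back to the unfiltered list rather than returning nothing.
    return kept if kept else candidates
```

In the framework described above, candidates failing the gate would trigger iterative regeneration rather than a simple fallback; this sketch only shows the similarity test itself.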