AI Summary
This work addresses the high computational cost of existing methods for temporal knowledge graph reasoning, which hinders deployment in resource-constrained environments, and the inability of conventional knowledge distillation to preserve temporal dependencies. To overcome these limitations, the paper proposes a novel distillation framework that, for the first time, leverages a large language model as the teacher to transfer both structural and temporal-aware reasoning capabilities to a lightweight student model. By integrating general semantic knowledge with dynamic temporal information, the approach substantially enhances the student's capacity for temporal modeling. Experimental results across multiple benchmark datasets demonstrate that the proposed framework outperforms strong baselines, achieving a favorable balance among inference accuracy, computational efficiency, and practical deployability.
Abstract
Reasoning over temporal knowledge graphs (TKGs) is fundamental to improving the efficiency and reliability of intelligent decision-making systems and has become a key technological foundation for future artificial intelligence applications. Despite recent progress, existing TKG reasoning models typically rely on large parameter sizes and intensive computation, leading to high hardware costs and energy consumption. These constraints hinder their deployment on resource-constrained, low-power, and distributed platforms that require real-time inference. Moreover, most existing model compression and distillation techniques are designed for static knowledge graphs and fail to adequately capture the temporal dependencies inherent in TKGs, often resulting in degraded reasoning performance. To address these challenges, we propose a distillation framework specifically tailored for temporal knowledge graph reasoning. Our approach leverages large language models as teacher models to guide the distillation process, enabling effective transfer of both structural and temporal reasoning capabilities to lightweight student models. By integrating large-scale public knowledge with task-specific temporal information, the proposed framework enhances the student model's ability to model temporal dynamics while maintaining a compact and efficient architecture. Extensive experiments on multiple publicly available benchmark datasets demonstrate that our method consistently outperforms strong baselines, achieving a favorable trade-off between reasoning accuracy, computational efficiency, and practical deployability.
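The abstract does not specify the exact distillation objective, but frameworks of this kind typically build on the classic teacher-student loss of Hinton et al., in which the student is trained to match the teacher's temperature-softened output distribution. The sketch below illustrates that standard objective only; it is not the paper's method, and the function names are illustrative.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) between temperature-softened
    distributions, scaled by T^2 as in standard knowledge
    distillation (generic illustration, not the paper's loss)."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (T ** 2) * kl

# A student that exactly matches the teacher incurs zero loss;
# any mismatch yields a positive KL penalty.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # → 0.0
print(distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0]) > 0)  # → True
```

A temporal-aware variant, as the abstract suggests, would additionally align the student with time-dependent teacher signals rather than a single static output distribution.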