🤖 AI Summary
Existing temporal knowledge graph reasoning (TKGR) methods suffer from insufficient temporal context modeling for fact interpolation (historical completion) and generalization bias for extrapolation (future prediction). This paper proposes a unified framework: first, it constructs dynamic entity-centric subgraphs and employs a collaborative dual-branch GNN to enable fine-grained temporal context modeling; second, it introduces a conditional diffusion-based generative regularization mechanism that compels the model to learn intrinsic event evolution laws rather than spurious statistical patterns, thereby unifying interpolation and extrapolation in a principled manner. The approach integrates dynamic subgraph sampling, snapshot sequence modeling, and temporal embedding. Evaluated on six benchmark datasets, it achieves state-of-the-art performance—improving average MRR by 2.61 points for interpolation and 1.45 points for extrapolation—demonstrating significantly enhanced long-horizon temporal generalization and zero-shot event prediction capability.
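The dynamic entity-centric subgraph construction mentioned above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual algorithm: the function name, the fixed time window, and the hop-limited expansion rule are all assumptions for exposition.

```python
# Hypothetical sketch of dynamic entity-centric subgraph sampling over a
# temporal KG stored as (subject, relation, object, timestamp) quadruples.
# The windowing rule and hop limit are illustrative assumptions.
from collections import defaultdict

def sample_subgraph(quads, center, t_query, window=2, hops=2):
    """Collect facts within `window` timesteps of `t_query` that are
    reachable from `center` in at most `hops` hops."""
    # Index facts by entity, keeping only those inside the time window.
    by_entity = defaultdict(list)
    for s, r, o, t in quads:
        if abs(t - t_query) <= window:
            by_entity[s].append((s, r, o, t))
            by_entity[o].append((s, r, o, t))

    # Breadth-first expansion from the query entity.
    frontier, seen, subgraph = {center}, {center}, set()
    for _ in range(hops):
        nxt = set()
        for e in frontier:
            for s, r, o, t in by_entity[e]:
                subgraph.add((s, r, o, t))
                for n in (s, o):
                    if n not in seen:
                        seen.add(n)
                        nxt.add(n)
        frontier = nxt
    return subgraph
```

In a full system, the sampled subgraph would then be encoded by the dual-branch GNN; here the sampler only demonstrates how the subgraph stays centered on the query entity and restricted to temporally nearby facts.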
📝 Abstract
Temporal Knowledge Graph Reasoning (TKGR) aims to complete missing factual elements along the timeline. Depending on the temporal position of the query, the task is categorized into interpolation and extrapolation. Existing interpolation methods typically embed temporal information into individual facts to complete missing historical knowledge, while extrapolation techniques often leverage sequence models over graph snapshots to identify recurring patterns for future event prediction. These methods face two critical challenges: limited contextual modeling in interpolation and cognitive generalization bias in extrapolation. To address these, we propose a unified method for TKGR, dubbed DynaGen. For interpolation, DynaGen dynamically constructs entity-centric subgraphs and processes them with a synergistic dual-branch GNN encoder to capture evolving structural context. For extrapolation, it applies a conditional diffusion process, which forces the model to learn underlying evolutionary principles rather than just superficial patterns, enhancing its ability to predict unseen future events. Extensive experiments on six benchmark datasets show DynaGen achieves state-of-the-art performance. On average, compared to the second-best models, DynaGen improves the Mean Reciprocal Rank (MRR) score by 2.61 points for interpolation and 1.45 points for extrapolation.
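The conditional diffusion process described in the abstract can be illustrated with a minimal DDPM-style sketch: a forward process that noises an event embedding, and a regularization loss that asks a conditional denoiser to recover the injected noise. The linear beta schedule, the toy interfaces, and all names below are assumptions for illustration, not DynaGen's actual formulation.

```python
# Minimal sketch of a conditional diffusion regularizer on a toy
# embedding vector. Schedule and interfaces are illustrative only.
import math, random

def make_alpha_bars(T=10, beta_start=1e-4, beta_end=0.02):
    """Cumulative products of (1 - beta_t) under a linear beta schedule."""
    alpha_bar, out = 1.0, []
    for t in range(T):
        beta = beta_start + (beta_end - beta_start) * t / (T - 1)
        alpha_bar *= 1.0 - beta
        out.append(alpha_bar)
    return out

def noisify(x0, t, alpha_bars, rng):
    """Forward process: x_t = sqrt(a_bar)*x0 + sqrt(1 - a_bar)*eps."""
    a = alpha_bars[t]
    eps = [rng.gauss(0.0, 1.0) for _ in x0]
    xt = [math.sqrt(a) * x + math.sqrt(1 - a) * e for x, e in zip(x0, eps)]
    return xt, eps

def diffusion_loss(x0, cond, predict_eps, alpha_bars, rng):
    """MSE between the injected noise and the conditional prediction.
    `cond` stands in for the graph context conditioning the denoiser."""
    t = rng.randrange(len(alpha_bars))
    xt, eps = noisify(x0, t, alpha_bars, rng)
    pred = predict_eps(xt, cond, t)
    return sum((p - e) ** 2 for p, e in zip(pred, eps)) / len(x0)
```

Because the denoiser must reconstruct the noise from a corrupted embedding plus the structural context, the loss rewards modeling how events actually evolve from context rather than memorizing recurrence statistics, which is the intuition behind the generative regularization.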