🤖 AI Summary
Temporal graph link prediction faces challenges arising from heterogeneous node interactions: a few node pairs dominate the interaction events, while inter-event time intervals are highly irregular. This hinders existing models from effectively encoding temporal information and causes them to forget the historical states of sparsely interacting node pairs. To address this, we propose TAMI, a plug-and-play framework for enhancing mainstream temporal GNNs. Its core components are: (1) a log time encoding (LTE) function that mitigates the bias induced by heavy-tailed time-interval distributions; and (2) a link history aggregation (LHA) mechanism that explicitly models and preserves the interaction history of each target node pair, including infrequently interacting ones. Evaluated on 13 classic datasets and three TGB benchmarks, TAMI consistently improves the link prediction performance of the underlying models in both transductive and inductive settings, with particularly large gains on low-frequency interactions.
📝 Abstract
Temporal graph link prediction aims to predict future interactions between nodes in a graph based on their historical interactions, which are encoded in node embeddings. We observe that heterogeneity naturally appears in temporal interactions: for example, a few node pairs can account for most interaction events, and interaction events occur at widely varying intervals. This leads to ineffective temporal information encoding and to the forgetting of past interactions between intermittently interacting node pairs, both of which hurt link prediction. Existing methods, however, do not account for such heterogeneity in their learning process, so their learned temporal node embeddings are less effective, especially when predicting links for infrequently interacting node pairs. To cope with this heterogeneity, we propose a novel framework called TAMI, which contains two effective components: a log time encoding function (LTE) and link history aggregation (LHA). LTE better encodes temporal information by transforming interaction intervals into more balanced ones, and LHA prevents the historical interactions of each target node pair from being forgotten. State-of-the-art temporal graph neural networks can be seamlessly integrated into TAMI to improve their effectiveness. Experimental results on 13 classic datasets and three recent temporal graph benchmark (TGB) datasets show that TAMI consistently improves the link prediction performance of the underlying models in both transductive and inductive settings. Our code is available at https://github.com/Alleinx/TAMI_temporal_graph.
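To make the LTE idea concrete, here is a minimal sketch of a log time encoding: inter-event intervals are compressed with `log1p` before being fed to a cosine feature map, in the style of TGAT-like temporal encoders. The function name, dimensionality, and frequency choices are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def log_time_encoding(delta_t, dim=8, base_freqs=None):
    """Hypothetical sketch of a log time encoding (LTE).

    Heavy-tailed inter-event intervals are first compressed with
    log1p so that very short and very long gaps occupy a comparable
    range, then mapped to a cosine feature vector.
    """
    delta_t = np.asarray(delta_t, dtype=float)
    # log1p balances the heavy-tailed interval distribution:
    # intervals of 1 vs 1e6 time units differ by a factor of ~14
    # after the transform, instead of 1e6.
    t = np.log1p(delta_t)
    if base_freqs is None:
        # Geometrically spaced frequencies, as in sinusoidal
        # positional/time encodings (an assumed choice here).
        base_freqs = 1.0 / (10.0 ** np.linspace(0.0, 4.0, dim))
    # Broadcast: (..., 1) * (dim,) -> (..., dim)
    return np.cos(t[..., None] * base_freqs)

# Raw intervals spanning six orders of magnitude collapse to a
# modest range after the log transform.
raw = np.array([1.0, 100.0, 1e6])
print(np.log1p(raw))
```

In a plug-and-play setting, such a function would replace the raw-interval time encoder of the underlying temporal GNN, leaving the rest of the model unchanged.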