🤖 AI Summary
This work addresses the challenge of effectively modeling temporal dependencies and structural information in sparse, continuously evolving dynamic networks, where existing approaches often yield suboptimal link prediction performance. To overcome this limitation, the authors propose a novel method that integrates Temporal Graph Networks (TGN) with the SEAL framework, explicitly incorporating local enclosing subgraphs into TGN for the first time. By jointly learning the local topology and temporal dynamics of candidate links, the model enables synergistic representation of structural and temporal features. Experimental results on sparse call detail record (CDR) datasets demonstrate that the proposed approach improves average precision by 2.6% over the standard TGN, significantly enhancing both the robustness and accuracy of link prediction in dynamic graphs.
📝 Abstract
Predicting links in sparse, continuously evolving networks is a central challenge in network science. Conventional heuristic methods and deep learning models, including Graph Neural Networks (GNNs), are typically designed for static graphs and thus struggle to capture temporal dependencies. Snapshot-based techniques partially address this issue but often encounter data sparsity and class imbalance, particularly in networks with transient interactions such as telecommunication call detail records (CDRs). Temporal Graph Networks (TGNs) model dynamic graphs by updating node embeddings over time; however, their predictive accuracy under sparse conditions remains limited. In this study, we improve the TGN framework by extracting enclosing subgraphs around candidate links, enabling the model to jointly learn structural and temporal information. Experiments on a sparse CDR dataset show that our approach increases average precision by 2.6% over standard TGNs, demonstrating the advantages of integrating local topology for robust link prediction in dynamic networks.
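The core structural idea, extracting an enclosing subgraph around a candidate link in the SEAL style, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the adjacency-list representation, and the hop parameter `k` are all assumptions for demonstration.

```python
from collections import deque

def enclosing_subgraph(adj, u, v, k=1):
    """Collect nodes within k hops of either endpoint of a candidate
    link (u, v), and the edges induced among them (SEAL-style sketch)."""
    nodes = set()
    for root in (u, v):
        # Breadth-first search up to depth k from each endpoint
        seen = {root}
        frontier = deque([(root, 0)])
        while frontier:
            node, depth = frontier.popleft()
            if depth == k:
                continue
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    frontier.append((nbr, depth + 1))
        nodes |= seen
    # Induced edges, each undirected pair listed once (a < b)
    edges = {(a, b) for a in nodes for b in adj[a] if b in nodes and a < b}
    return nodes, edges

# Toy interaction graph standing in for a sparse CDR snapshot
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
nodes, edges = enclosing_subgraph(adj, 1, 3, k=1)
print(sorted(nodes))  # [0, 1, 2, 3, 4]
```

In the proposed model, a subgraph like this around each candidate link would be fed to the TGN alongside its temporal event stream, so that structural context and interaction history are learned jointly.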