🤖 AI Summary
Existing temporal graph neural networks (TGNNs) model only pairwise interactions, neglecting higher-order structures such as group-level dependencies, and suffer from poor memory efficiency, which limits both their expressive power and their scalability. To address these limitations, we propose HTGN, the first temporal graph neural network to integrate hypergraph representation learning. HTGN brings higher-order structural modeling to dynamic link prediction by constructing hyperedge representations through multi-feature edge aggregation and by designing a time-aware hypergraph message-passing mechanism. We theoretically establish that HTGN is strictly more expressive than conventional TGNNs. Empirically, HTGN achieves significant improvements in link prediction accuracy on multiple real-world temporal graph datasets, while reducing peak training memory consumption by up to 50%, balancing high expressivity with computational efficiency.
📝 Abstract
Temporal Graph Neural Networks (TGNNs) have gained growing attention for modeling and predicting structures in temporal graphs. However, existing TGNNs primarily focus on pairwise interactions while overlooking higher-order structures that are integral to link formation and evolution in real-world temporal graphs. Meanwhile, these models often suffer from efficiency bottlenecks, which further limit their expressive power. To tackle these challenges, we propose a Higher-order structure Temporal Graph Neural Network (HTGN), which incorporates hypergraph representations into temporal graph learning. In particular, we develop an algorithm to identify the underlying higher-order structures, enhancing the model's ability to capture group interactions. Furthermore, by aggregating multiple edge features into hyperedge representations, HTGN effectively reduces memory cost during training. We theoretically demonstrate the enhanced expressiveness of our approach and validate its effectiveness and efficiency through extensive experiments on various real-world temporal graphs. Experimental results show that HTGN achieves superior performance on dynamic link prediction while reducing memory costs by up to 50% compared to existing methods.
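To make the core idea concrete, the sketch below illustrates the two mechanisms the abstract names: pooling several edge feature vectors into a single hyperedge representation (the source of the memory saving, since one vector is stored per group rather than per edge), and weighting a hyperedge's message by the elapsed time. The function names, the mean-pooling aggregator, and the exponential time decay are all illustrative assumptions, not the paper's exact design.

```python
import math

def aggregate_hyperedge(edge_features):
    """Mean-pool equal-length edge feature vectors into one hyperedge
    representation (assumed aggregator; the paper only says 'multi-feature
    edge aggregation')."""
    dim = len(edge_features[0])
    n = len(edge_features)
    return [sum(f[d] for f in edge_features) / n for d in range(dim)]

def time_aware_message(hyperedge_repr, dt, tau=1.0):
    """Scale a hyperedge message by exp(-dt / tau), a common way to make
    message passing time-aware (an assumption standing in for HTGN's
    actual temporal mechanism)."""
    w = math.exp(-dt / tau)
    return [w * x for x in hyperedge_repr]

# Two member edges with 2-d features collapse into one stored vector.
edges = [[1.0, 2.0], [3.0, 4.0]]
h = aggregate_hyperedge(edges)        # [2.0, 3.0]
msg = time_aware_message(h, dt=0.0)   # decay weight exp(0) = 1, unchanged
```

With this kind of grouping, a hyperedge covering k edges keeps one vector instead of k, which is the intuition behind the reported reduction in peak training memory.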