Higher-order Structure Boosts Link Prediction on Temporal Graphs

📅 2025-05-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing temporal graph neural networks (TGNNs) are limited to modeling pairwise interactions, neglecting higher-order structures (e.g., group-level dependencies), and suffer from poor memory efficiency—hindering both expressive power and scalability. To address these limitations, we propose HTGN—the first temporal graph neural network integrating hypergraph representation learning. HTGN introduces higher-order structural modeling into dynamic link prediction by constructing hyperedge representations via multi-feature edge aggregation and designing a temporal-aware hypergraph message-passing mechanism. We theoretically establish that HTGN possesses strictly greater expressive power than conventional TGNNs. Empirically, HTGN achieves significant improvements in link prediction accuracy across multiple real-world temporal graph datasets. Moreover, it reduces peak training memory consumption by up to 50%, effectively balancing high expressivity with computational efficiency.

Technology Category

Application Category

📝 Abstract
Temporal Graph Neural Networks (TGNNs) have gained growing attention for modeling and predicting structures in temporal graphs. However, existing TGNNs primarily focus on pairwise interactions while overlooking higher-order structures that are integral to link formation and evolution in real-world temporal graphs. Meanwhile, these models often suffer from efficiency bottlenecks, further limiting their expressive power. To tackle these challenges, we propose a Higher-order structure Temporal Graph Neural Network, which incorporates hypergraph representations into temporal graph learning. In particular, we develop an algorithm to identify the underlying higher-order structures, enhancing the model's ability to capture the group interactions. Furthermore, by aggregating multiple edge features into hyperedge representations, HTGN effectively reduces memory cost during training. We theoretically demonstrate the enhanced expressiveness of our approach and validate its effectiveness and efficiency through extensive experiments on various real-world temporal graphs. Experimental results show that HTGN achieves superior performance on dynamic link prediction while reducing memory costs by up to 50% compared to existing methods.
Problem

Research questions and friction points this paper is trying to address.

Existing TGNNs overlook higher-order structures in temporal graphs
Current models face efficiency bottlenecks limiting expressive power
Need to enhance link prediction while reducing memory costs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporates hypergraph representations into temporal learning
Identifies higher-order structures to capture group interactions
Reduces memory cost by aggregating edge features
🔎 Similar Papers
No similar papers found.