TIDFormer: Exploiting Temporal and Interactive Dynamics Makes A Great Dynamic Graph Transformer

📅 2025-05-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge in dynamic graph modeling where conventional self-attention mechanisms struggle to jointly capture temporal dependencies, interaction evolution, and interpretability, this paper proposes an interpretable dynamic graph self-attention mechanism, addressing the open problem of uninterpretable attention definitions on dynamic graphs in prior work. Methodologically: (1) a calendar-based time partitioning encoding is introduced to explicitly model absolute temporal semantics; (2) a first-order interaction embedding coupled with neighborhood sampling is designed to jointly represent temporal and structural dynamics; (3) a lightweight historical pattern decomposition module is embedded to capture evolving interaction trends. Evaluated on multiple standard dynamic graph benchmarks, the method achieves comprehensive improvements over state-of-the-art approaches, with significant inference speedup. Crucially, it provides dual interpretability (both temporal and structural) through attention weight analysis, enabling transparent insight into how temporal patterns and graph topology jointly influence representation learning.
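To make the calendar-based time partitioning idea concrete, here is a minimal sketch of how a timestamp could be split into calendar fields and encoded. The choice of partitions (month, weekday, hour) and the one-hot encoding are illustrative assumptions; the paper's exact scheme, and whether it uses learned embedding tables per field, may differ.

```python
from datetime import datetime, timezone

def calendar_partition(ts: float) -> dict:
    """Split a Unix timestamp into calendar fields.

    Illustrative choice of partitions (month/weekday/hour);
    the paper's actual partitioning scheme may differ.
    """
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    return {"month": dt.month, "weekday": dt.weekday(), "hour": dt.hour}

def calendar_encoding(ts: float) -> list:
    """Concatenate a one-hot vector per calendar field.

    In a real model each field would index a learned embedding
    table instead of a fixed one-hot block.
    """
    f = calendar_partition(ts)
    enc = [0.0] * (12 + 7 + 24)   # month | weekday | hour blocks
    enc[f["month"] - 1] = 1.0
    enc[12 + f["weekday"]] = 1.0
    enc[12 + 7 + f["hour"]] = 1.0
    return enc
```

Unlike relative time-difference encodings common in prior Transformer-based DGNNs, a calendar partition keeps absolute semantics (e.g. "Friday evening") that recur across the graph's history.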

📝 Abstract
Due to the proficiency of self-attention mechanisms (SAMs) in capturing dependencies in sequence modeling, several existing dynamic graph neural networks (DGNNs) utilize Transformer architectures with various encoding designs to capture sequential evolutions of dynamic graphs. However, the effectiveness and efficiency of these Transformer-based DGNNs vary significantly, highlighting the importance of properly defining the SAM on dynamic graphs and comprehensively encoding temporal and interactive dynamics without extra complex modules. In this work, we propose TIDFormer, a dynamic graph TransFormer that fully exploits Temporal and Interactive Dynamics in an efficient manner. We clarify and verify the interpretability of our proposed SAM, addressing the open problem of its uninterpretable definitions on dynamic graphs in previous works. To model the temporal and interactive dynamics, respectively, we utilize the calendar-based time partitioning information and extract informative interaction embeddings for both bipartite and non-bipartite graphs using merely the sampled first-order neighbors. In addition, we jointly model temporal and interactive features by capturing potential changes in historical interaction patterns through a simple decomposition. We conduct extensive experiments on several dynamic graph datasets to verify the effectiveness and efficiency of TIDFormer. The experimental results demonstrate that TIDFormer excels, outperforming state-of-the-art models across most datasets and experimental settings. Furthermore, TIDFormer exhibits significant efficiency advantages compared to previous Transformer-based methods.
Problem

Research questions and friction points this paper is trying to address.

Defining effective self-attention mechanisms for dynamic graphs
Encoding temporal and interactive dynamics efficiently
Improving interpretability of dynamic graph Transformers
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses calendar-based time partitioning
Extracts interaction embeddings efficiently
Models historical interaction patterns simply
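The "simple decomposition" of historical interaction patterns can be pictured as splitting a node's interaction history into a smooth trend and a residual that flags recent change. The moving-average form below is a hypothetical stand-in, not the paper's actual operator.

```python
def decompose(series: list, window: int = 3):
    """Split a historical interaction series into a trend and a residual.

    Trend = trailing moving average over `window` steps; residual =
    series minus trend, highlighting deviations from the usual pattern.
    A hedged sketch of what a 'simple decomposition' could look like.
    """
    trend = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        trend.append(sum(series[lo:i + 1]) / (i + 1 - lo))
    residual = [x - t for x, t in zip(series, trend)]
    return trend, residual
```

Feeding both components to the attention layers lets the model separate a node's stable interaction habits from emerging shifts.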