Temporal Graph Pattern Machine

📅 2026-01-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing dynamic graph modeling approaches are limited by short-term dependencies, static neighborhood semantics, and retrospective temporal assumptions, which hinder their ability to capture transferable evolutionary patterns. This work proposes the Temporal Graph Pattern Machine (TGPM), a foundation model framework for temporal graphs centered on universal evolutionary patterns, departing from conventional task-specific paradigms. TGPM employs time-biased random walks to generate interaction patches and leverages a Transformer backbone to learn multi-scale structural semantics and long-range dependencies. It further introduces self-supervised objectives, including masked token modeling and next-time prediction, to uncover the underlying evolutionary dynamics. The method achieves state-of-the-art performance on both transductive and inductive link prediction tasks and demonstrates significantly enhanced cross-domain transferability.
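As a rough illustration of the time-biased random walks the summary describes, the sketch below samples a backward-in-time walk that favors recent interactions through an exponential bias. This is a minimal sketch under assumptions: the function name, the adjacency format, and the exponential decay parameter `tau` are illustrative choices, not details taken from the paper.

```python
import math
import random

def time_biased_walk(adj, start, t_start, length, tau=1.0, seed=0):
    """Sample one temporally biased random walk.

    adj maps node -> list of (neighbor, timestamp) interactions.
    At each step only edges earlier than the current time are eligible,
    and more recent edges are weighted higher via exp(-(t - t_e) / tau)
    (a common choice; the paper's exact bias may differ).
    """
    rng = random.Random(seed)
    walk = [(start, t_start)]
    node, t = start, t_start
    for _ in range(length):
        # Candidate edges: interactions that happened strictly before t.
        cand = [(v, te) for v, te in adj.get(node, []) if te < t]
        if not cand:
            break
        weights = [math.exp(-(t - te) / tau) for _, te in cand]
        node, t = rng.choices(cand, weights=weights, k=1)[0]
        walk.append((node, t))
    return walk
```

Because each step moves strictly backward in time, the walk terminates naturally once a node has no earlier interactions, yielding a short history patch rooted at the query interaction.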

📝 Abstract
Temporal graph learning is pivotal for deciphering dynamic systems, where the core challenge lies in explicitly modeling the underlying evolving patterns that govern network transformation. However, prevailing methods are predominantly task-centric and rely on restrictive assumptions -- such as short-term dependency modeling, static neighborhood semantics, and retrospective time usage. These constraints hinder the discovery of transferable temporal evolution mechanisms. To address this, we propose the Temporal Graph Pattern Machine (TGPM), a foundation framework that shifts the focus toward directly learning generalized evolving patterns. TGPM conceptualizes each interaction as an interaction patch synthesized via temporally-biased random walks, thereby capturing multi-scale structural semantics and long-range dependencies that extend beyond immediate neighborhoods. These patches are processed by a Transformer-based backbone designed to capture global temporal regularities while adapting to context-specific interaction dynamics. To further empower the model, we introduce a suite of self-supervised pre-training tasks -- specifically masked token modeling and next-time prediction -- to explicitly encode the fundamental laws of network evolution. Extensive experiments show that TGPM consistently achieves state-of-the-art performance in both transductive and inductive link prediction, demonstrating exceptional cross-domain transferability.
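The abstract names masked token modeling as one of the self-supervised pre-training tasks. The sketch below shows the generic corruption step such an objective relies on: hide a fraction of patch tokens and ask the model to recover them. The function name, mask ratio, and mask symbol are illustrative assumptions; TGPM's actual tokenization and masking scheme is not specified here.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", ratio=0.15, seed=0):
    """Corrupt a token sequence for masked token modeling.

    Returns the corrupted sequence plus the masked positions, whose
    original tokens become the reconstruction targets. (Illustrative
    sketch; the paper may mask patches rather than discrete tokens.)
    """
    rng = random.Random(seed)
    n = max(1, int(len(tokens) * ratio))
    idx = sorted(rng.sample(range(len(tokens)), n))
    corrupted = list(tokens)
    for i in idx:
        corrupted[i] = mask_token
    return corrupted, idx
```

In pre-training, the Transformer would consume the corrupted sequence and be scored on how well it predicts the original tokens at the returned positions.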
Problem

Research questions and friction points this paper is trying to address.

Temporal Graph Learning
Evolving Patterns
Network Evolution
Transferable Mechanisms
Dynamic Systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Temporal Graph Learning
Interaction Patch
Time-biased Random Walk
Transformer-based Backbone
Self-supervised Pre-training
Authors
Yijun Ma, University of Notre Dame
Zehong Wang, University of Notre Dame
Machine Learning · Foundation Model · Graph Learning
Weixiang Sun, University of Notre Dame
Yanfang Ye, University of Notre Dame