Efficient Learning-based Graph Simulation for Temporal Graphs

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing temporal graph generation methods suffer from inefficient training, slow inference, and difficulty in jointly preserving structural and temporal characteristics. To address these challenges, we propose the Temporal Graph Autoencoder (TGAE), an efficient generative framework that combines a global graph encoder, which models spatial-temporal dependencies with attention, and a local ego-graph-sampling decoder for lightweight, scalable edge generation. The encoder captures holistic spatiotemporal patterns, while the decoder enables fast, localized reconstruction. Extensive experiments on multiple real-world and synthetic temporal graph datasets show that TGAE outperforms state-of-the-art learning-based generators in both generation quality (measured by AUC and MSE) and computational efficiency (measured by training and generation time).
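To make the decoder's locality concrete, here is a minimal sketch of k-hop ego-graph sampling, the kind of local neighborhood extraction the summary describes. All names and the toy adjacency list are illustrative assumptions, not the authors' implementation:

```python
from collections import deque

def sample_ego_graph(adj, center, k=1):
    """BFS out to k hops from `center` and return the induced ego-graph.

    `adj` maps each node to its undirected neighbor list; returns the
    set of nodes within k hops and the edges among those nodes.
    """
    dist = {center: 0}
    queue = deque([center])
    while queue:
        u = queue.popleft()
        if dist[u] == k:          # frontier reached: stop expanding
            continue
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    nodes = set(dist)
    # keep each undirected edge once (u < v) if both endpoints are inside
    edges = [(u, v) for u in nodes for v in adj.get(u, []) if v in nodes and u < v]
    return nodes, edges

# toy snapshot of a temporal graph as an adjacency list
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4], 4: [3]}
nodes, edges = sample_ego_graph(adj, 0, k=1)  # nodes {0, 1, 2}
```

Decoding on such small induced subgraphs, rather than on the full adjacency matrix, is what keeps per-edge generation cost independent of the global graph size.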

📝 Abstract
Graph simulation has recently received a surge of attention in graph processing and analytics. In real-life applications such as social science, biology, and chemistry, many graphs evolve over time and are naturally represented as a series of snapshots (i.e., temporal graphs). Most existing graph generators, however, focus on static graphs and ignore this temporal information. In this paper, we focus on simulating temporal graphs, aiming to reproduce the structural and temporal properties of observed real-life temporal graphs. We first give an overview of existing temporal graph generators, including recently emerged learning-based approaches. Most of these learning-based methods suffer from low training efficiency or slow generation, especially the temporal random-walk-based methods. We therefore propose an efficient learning-based approach to generating graph snapshots, namely the temporal graph autoencoder (TGAE). Specifically, we propose an attention-based graph encoder that encodes temporal and structural characteristics on sampled ego-graphs, and an ego-graph decoder that achieves a good trade-off between simulation quality and efficiency in temporal graph generation. Finally, we evaluate TGAE against representative temporal graph generators on real-life and synthesized temporal graphs. The results show that our approach outperforms state-of-the-art temporal graph generators in terms of both simulation quality and efficiency.
Problem

Research questions and friction points this paper is trying to address.

Simulating temporal graphs with structural and temporal properties
Addressing low efficiency in training and slow generation
Improving simulation quality and efficiency of graph generators
Innovation

Methods, ideas, or system contributions that make the work stand out.

Learning-based temporal graph autoencoder for simulation
Attention-based encoder captures temporal structural features
Ego-graph decoder balances quality and generation efficiency
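The attention-based encoding named in the bullets above can be illustrated with a toy aggregation step: score each neighbor embedding against the center node, softmax the scores, and take the weighted sum. This is a minimal NumPy sketch under assumed 2-d embeddings; the function name and data are illustrative, not the paper's architecture:

```python
import numpy as np

def attention_pool(center, neighbors):
    """Attention-weighted neighbor aggregation.

    `center` is a (d,) embedding; `neighbors` is an (n, d) matrix of
    neighbor embeddings. Scores are dot products with the center,
    normalized by a numerically stable softmax.
    """
    scores = neighbors @ center            # (n,) raw attention scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax over neighbors
    return weights @ neighbors             # (d,) pooled embedding

center = np.array([1.0, 0.0])
neighbors = np.array([[1.0, 0.0],         # aligned with center: high weight
                      [0.0, 1.0]])        # orthogonal: lower weight
pooled = attention_pool(center, neighbors)
```

The neighbor most similar to the center dominates the pooled vector, which is the basic mechanism that lets an attention encoder emphasize structurally relevant neighbors within each sampled ego-graph.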
👥 Authors
Sheng Xiang, Tongji University (Graph Simulation, Generative Model)
Chenhao Xu, Victoria University (Deep Learning, Edge Computing, Blockchain)
Dawei Cheng, Tongji University (Data Mining, Graph Learning, Deep Learning, Big Data in Finance)
Xiaoyang Wang, School of Computer Science and Engineering, University of New South Wales, Sydney, Australia
Ying Zhang, Australia Artificial Intelligence Institute, University of Technology Sydney, Sydney, Australia