TGM: a Modular and Efficient Library for Machine Learning on Temporal Graphs

📅 2025-10-08
🤖 AI Summary
Temporal graph machine learning lacks a unified, flexible open-source framework; existing tools are largely confined to either continuous-time dynamic graphs (CTDGs) or discrete-time dynamic graphs (DTDGs), and struggle to support dynamic feature evolution and multi-granularity temporal modeling. Method: We introduce the first open-source library unifying CTDG and DTDG paradigms, featuring a modular architecture, efficient temporal indexing, and native support for dynamic feature updates—enabling link-, node-, and graph-level prediction tasks under time-driven training. Contribution/Results: The library provides a unified interface for both paradigms and enables cross-paradigm model integration, facilitating emerging directions such as dynamic graph attribute prediction. Experiments show an average 7.8× speedup in training over DyGLib and up to 175× acceleration in graph discretization, significantly improving research efficiency and scalability.

📝 Abstract
Well-designed open-source software drives progress in Machine Learning (ML) research. While static graph ML enjoys mature frameworks like PyTorch Geometric and DGL, ML for temporal graphs (TG), networks that evolve over time, lacks comparable infrastructure. Existing TG libraries are often tailored to specific architectures, hindering support for diverse models in this rapidly evolving field. Additionally, the divide between continuous- and discrete-time dynamic graph methods (CTDG and DTDG) limits direct comparisons and idea transfer. To address these gaps, we introduce Temporal Graph Modelling (TGM), a research-oriented library for ML on temporal graphs, the first to unify CTDG and DTDG approaches. TGM offers first-class support for dynamic node features, time-granularity conversions, and native handling of link-, node-, and graph-level tasks. Empirically, TGM achieves an average 7.8x speedup across multiple models, datasets, and tasks compared to the widely used DyGLib, and an average 175x speedup on graph discretization relative to available implementations. Beyond efficiency, we show in our experiments how TGM unlocks entirely new research possibilities by enabling dynamic graph property prediction and time-driven training paradigms, opening the door to questions previously impractical to study. TGM is available at https://github.com/tgm-team/tgm
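To make the "time-granularity conversion" idea concrete, here is a minimal illustrative sketch of discretizing a continuous-time edge stream (CTDG) into fixed-width snapshots (DTDG). This is a generic bucketing algorithm written for this summary, not TGM's actual API; the function name and signature are assumptions.

```python
from collections import defaultdict

def discretize(edges, granularity):
    """Bucket a continuous-time edge stream of (src, dst, t) tuples
    into discrete snapshots of temporal width `granularity`.
    Illustrative only; not the TGM library's interface."""
    snapshots = defaultdict(list)
    for src, dst, t in edges:
        # Integer division assigns each edge to its time bucket.
        snapshots[t // granularity].append((src, dst))
    # Return snapshots ordered by time bucket.
    return [snapshots[k] for k in sorted(snapshots)]

# Four timestamped edges; with granularity=10 they fall into two snapshots.
stream = [(0, 1, 3), (1, 2, 5), (0, 2, 12), (2, 3, 14)]
print(discretize(stream, granularity=10))
# → [[(0, 1), (1, 2)], [(0, 2), (2, 3)]]
```

A naive per-edge pass like this is where the paper's reported up-to-175× discretization speedup would matter at scale, since real temporal graphs contain millions of timestamped edges.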
Problem

Research questions and friction points this paper is trying to address.

Addresses lack of unified infrastructure for temporal graph machine learning
Bridges the divide between continuous and discrete-time dynamic graph methods
Enables efficient and diverse model support for evolving network research
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifies continuous and discrete temporal graph approaches
Supports dynamic node features and multi-level tasks
Achieves significant speedup over existing temporal graph libraries