🤖 AI Summary
To address the high computational complexity and poor scalability of large-scale dynamic graph learning, which stem from accumulating historical interactions, this paper proposes a time-aware, lightweight dynamic graph neural network paradigm. The method introduces three key innovations: (1) a parameter-free graph propagation mechanism that reconstructs temporal topology during preprocessing; (2) a scalable temporal encoding scheme based on exponential functions that explicitly captures long-range temporal dependencies; and (3) a hypernetwork-driven message aggregation module that adaptively fuses multi-hop historical node representations. Evaluated on node- and link-level prediction tasks across 12 standard dynamic graph benchmarks, the approach achieves state-of-the-art or competitive performance while reducing model parameters by 37% on average and speeding up inference by 2.1×. The framework thus strikes a favorable trade-off among accuracy, efficiency, and scalability.
📝 Abstract
Dynamic graphs (DGs), which capture time-evolving relationships between graph entities, have widespread real-world applications. To efficiently encode DGs for downstream tasks, most dynamic graph neural networks follow the traditional message-passing mechanism and extend it with time-based techniques. Despite their effectiveness, the growth of historical interactions introduces significant scalability issues, particularly in industrial scenarios. To address this limitation, we propose ScaDyG, whose core idea is a time-aware scalable learning paradigm with three components: 1) Time-aware Topology Reformulation: ScaDyG first segments historical interactions into time steps (intra- and inter-step) based on the dynamics being modeled, enabling weight-free, time-aware graph propagation during preprocessing. 2) Dynamic Temporal Encoding: To achieve fine-grained graph propagation within time steps, ScaDyG integrates a temporal encoding built from a combination of exponential functions in a scalable manner. 3) Hypernetwork-driven Message Aggregation: After obtaining the propagated features (i.e., messages), ScaDyG utilizes a hypernetwork to analyze historical dependencies, producing node-wise representations through adaptive temporal fusion. Extensive experiments on 12 datasets demonstrate that ScaDyG matches or outperforms other SOTA methods on both node- and link-level downstream tasks, with fewer learnable parameters and higher efficiency.
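To make the three components concrete, the following is a minimal toy sketch of the overall flow, not the authors' implementation: weight-free propagation as a preprocessing step, an exponential temporal weighting over time steps, and a small hypernetwork-style MLP that emits per-node fusion weights over the per-step messages. All array sizes, decay rates, and the MLP shape are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, feat_dim, num_steps = 5, 8, 4

# Hypothetical toy data: node features and one random adjacency per time step.
X = rng.normal(size=(num_nodes, feat_dim))
adjs = [(rng.random((num_nodes, num_nodes)) < 0.4).astype(float)
        for _ in range(num_steps)]

# (1) Weight-free propagation: row-normalized mean aggregation, applied
# once as preprocessing, with no learnable parameters.
def propagate(A, H):
    deg = A.sum(axis=1, keepdims=True) + 1e-9
    return (A / deg) @ H

# (2) Temporal encoding from a combination of exponential decays over the
# age of each time step (decay rates are illustrative).
alphas = np.array([0.5, 1.0, 2.0])
def temporal_weight(age):
    return np.exp(-alphas * age).mean()  # scalar mix of exponential basis

messages = np.stack([
    temporal_weight(num_steps - 1 - t) * propagate(adjs[t], X)
    for t in range(num_steps)
])  # shape: (steps, nodes, dim)

# (3) Hypernetwork-style fusion: a tiny MLP maps each node's features to
# softmax weights over time steps, so fusion is adaptive per node.
W1 = rng.normal(size=(feat_dim, 16)) * 0.1
W2 = rng.normal(size=(16, num_steps)) * 0.1
logits = np.maximum(X @ W1, 0) @ W2              # (nodes, steps)
gates = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Adaptive node-wise combination of the multi-step messages.
H_out = np.einsum('ns,snd->nd', gates, messages)
print(H_out.shape)  # (5, 8)
```

Because propagation happens once up front and only the small fusion network is learned, the trainable parameter count stays independent of how much history accumulates, which is the source of the scalability claim.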