ChronoSpike: An Adaptive Spiking Graph Neural Network for Dynamic Graphs

📅 2026-02-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a central challenge in dynamic graph representation learning: simultaneously modeling structural relationships and temporal evolution, where existing methods often suffer from limited computational efficiency, gradient instability, or inadequate global-context capture. To this end, we propose ChronoSpike, the first framework to integrate adaptive spiking neural networks into dynamic graph learning. ChronoSpike combines learnable leaky integrate-and-fire (LIF) neurons with channel-wise membrane potential dynamics, multi-head attention for spatial aggregation, and a lightweight Transformer-based temporal encoder. This design enables fine-grained local modeling and long-range dependency capture while maintaining linear memory complexity and a fixed parameter count of only 105K. Experiments on three large-scale benchmarks show that ChronoSpike improves Macro-F1 and Micro-F1 by 2.0% and 2.4%, respectively, trains 3–10× faster than recurrent approaches, and achieves 83–88% activation sparsity with theoretical stability guarantees.
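The summary's core mechanism, a LIF neuron with a learnable per-channel membrane decay, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the hard-reset rule, and the threshold value are assumptions.

```python
import numpy as np

def adaptive_lif_step(v, x, tau, v_th=1.0, v_reset=0.0):
    """One time step of a leaky integrate-and-fire (LIF) neuron with
    channel-wise membrane dynamics (illustrative sketch; the hard reset
    and fixed threshold are assumptions, not ChronoSpike's exact rule).

    v   : (d,) membrane potentials, one per channel
    x   : (d,) input current at this step
    tau : (d,) learnable per-channel decay factors in (0, 1)
    """
    v = tau * v + x                          # leaky integration, channel-wise decay
    spikes = (v >= v_th).astype(v.dtype)     # binary spike where threshold is crossed
    v = np.where(spikes > 0, v_reset, v)     # hard reset on spiking channels
    return v, spikes

# Run a short sequence: channels with smaller tau forget faster,
# giving the heterogeneous temporal receptive fields the paper reports.
d, T = 4, 6
rng = np.random.default_rng(0)
tau = np.array([0.2, 0.5, 0.8, 0.95])
v = np.zeros(d)
rates = np.zeros(d)
for _ in range(T):
    v, s = adaptive_lif_step(v, rng.uniform(0.0, 0.5, size=d), tau)
    rates += s
print(rates / T)  # per-channel firing rates; binary spikes yield sparse activations
```

In a full model, `tau` would be trained jointly with the network (via a surrogate gradient for the non-differentiable threshold), and the spike trains would feed the attentive spatial aggregation and temporal encoder.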

📝 Abstract
Dynamic graph representation learning requires capturing both structural relationships and temporal evolution, yet existing approaches face a fundamental trade-off: attention-based methods achieve expressiveness at $O(T^2)$ complexity, while recurrent architectures suffer from gradient pathologies and dense state storage. Spiking neural networks offer event-driven efficiency but remain limited by sequential propagation, binary information loss, and local aggregation that misses global context. We propose ChronoSpike, an adaptive spiking graph neural network that integrates learnable LIF neurons with per-channel membrane dynamics, multi-head attentive spatial aggregation on continuous features, and a lightweight Transformer temporal encoder, enabling both fine-grained local modeling and long-range dependency capture with linear memory complexity $O(T \cdot d)$. On three large-scale benchmarks, ChronoSpike outperforms twelve state-of-the-art baselines by $2.0\%$ Macro-F1 and $2.4\%$ Micro-F1 while achieving $3-10\times$ faster training than recurrent methods with a constant 105K-parameter budget independent of graph size. We provide theoretical guarantees for membrane potential boundedness, gradient flow stability under contraction factor $\rho<1$, and BIBO stability; interpretability analyses reveal heterogeneous temporal receptive fields and a learned primacy effect with $83-88\%$ sparsity.
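The abstract's boundedness and stability claims follow from standard discrete-time LIF dynamics; the formalization below is an assumed reconstruction from the stated contraction factor $\rho < 1$, not notation taken verbatim from the paper.

```latex
% Assumed discrete-time adaptive LIF update, per channel c, with
% learnable decay \lambda_c \in (0, 1):
v_{t,c} = \lambda_c \, v_{t-1,c} + I_{t,c},
\qquad
s_{t,c} = \Theta\!\left(v_{t,c} - v_{\mathrm{th}}\right)

% If |\lambda_c| \le \rho < 1 and inputs are bounded, |I_{t,c}| \le B,
% unrolling the recursion gives a geometric series, hence a bounded
% membrane potential (BIBO stability):
|v_{t,c}| \;\le\; B \sum_{k=0}^{t-1} \rho^{k} \;\le\; \frac{B}{1 - \rho}

% Gradients through T steps scale with \prod_{t} \lambda_c \le \rho^{T},
% so backpropagation through time contracts geometrically instead of
% exploding.
```

Under this reading, each neuron stores only its current potential $v_t \in \mathbb{R}^d$, and keeping the $T$ step outputs for the temporal encoder is what yields the $O(T \cdot d)$ memory claim.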
Problem

Research questions and friction points this paper is trying to address.

dynamic graph representation learning
spiking neural networks
temporal evolution
graph neural networks
computational complexity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spiking Neural Networks
Dynamic Graph Representation Learning
Adaptive LIF Neurons
Linear Memory Complexity
Temporal Transformer Encoder