Graph Variate Neural Networks

📅 2025-09-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Modeling dynamic spatio-temporal signals is challenging when an underlying graph structure is absent or unrelated to the signal. Method: We propose Graph-Variate Neural Networks (GVNNs), which introduce a signal-dependent dynamic connectivity tensor to jointly model a stable long-term topology and instantaneous, data-driven functional connectivity; the resulting dynamic convolution needs no sliding windows and estimates a functional network at each time step at a cost linear in sequence length. GVNNs unify Graph Variate Signal Analysis (GVSA) and graph signal processing, integrating a stable support graph, dynamic convolution, and a synergistic GNN–sequence-modeling architecture. Results: On multivariate time-series forecasting benchmarks, GVNNs consistently outperform strong graph-based baselines and are competitive with LSTM and Transformer baselines. On EEG-based motor imagery classification, a canonical dynamic functional connectivity task, GVNNs achieve strong accuracy, demonstrating their effectiveness and promise for brain-computer interface applications.
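To make the core operation concrete, below is a minimal PyTorch sketch of a graph-variate layer in the spirit of the description above. The choice of the outer product x_t x_t^T as the instantaneous connectivity profile, and all names (GraphVariateLayer, W, etc.), are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class GraphVariateLayer(nn.Module):
    """Sketch of a graph-variate convolution: at each time step t the signal
    x_t is mixed by G_t = W * (x_t x_t^T), an elementwise (Hadamard) product
    of a stable support W with an instantaneous, signal-driven profile."""

    def __init__(self, n_nodes: int):
        super().__init__()
        # Stable long-term support; could instead be a fixed graph,
        # e.g. a correlation network estimated from training data.
        self.W = nn.Parameter(0.01 * torch.randn(n_nodes, n_nodes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, nodes)
        inst = x.unsqueeze(-1) * x.unsqueeze(-2)   # (B, T, N, N) outer products
        g = self.W * inst                          # signal-dependent tensor
        h = torch.einsum('btij,btj->bti', g, x)    # per-step graph mixing
        return torch.relu(h)                       # (B, T, N)
```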

📝 Abstract
Modelling dynamically evolving spatio-temporal signals is a prominent challenge in the Graph Neural Network (GNN) literature. Notably, GNNs assume an existing underlying graph structure. While this underlying structure may not always exist, or may be derived independently of the signal, a temporally evolving functional network can always be constructed from multi-channel data. Graph Variate Signal Analysis (GVSA) defines a unified framework consisting of a network tensor of instantaneous connectivity profiles against a stable support, usually constructed from the signal itself. Building on GVSA and tools from graph signal processing, we introduce Graph-Variate Neural Networks (GVNNs): layers that convolve spatio-temporal signals with a signal-dependent connectivity tensor combining a stable long-term support with instantaneous, data-driven interactions. This design captures dynamic statistical interdependencies at each time step without ad hoc sliding windows and admits an efficient implementation with linear complexity in sequence length. Across forecasting benchmarks, GVNNs consistently outperform strong graph-based baselines and are competitive with widely used sequence models such as LSTMs and Transformers. On EEG motor-imagery classification, GVNNs achieve strong accuracy, highlighting their potential for brain-computer interface applications.
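One way to see why the per-step estimation stays cheap: with the outer-product profile assumed in the sketch above, the Hadamard-masked mixing satisfies (W ∘ x x^T) x = x ∘ (W (x ∘ x)), so no N×N matrix ever needs to be materialized per step and the total cost grows linearly with sequence length T. The check below is an illustration of that identity under the same assumption, not the paper's code.

```python
import torch

# Numerical check of (W * outer(x, x)) @ x == x * (W @ (x * x)):
# each step reduces to one mat-vec, so the cost is O(N^2) per step
# and linear in the number of time steps T.
torch.manual_seed(0)
N = 8
W = torch.randn(N, N)
x = torch.randn(N)

dense = (W * torch.outer(x, x)) @ x   # explicit N x N construction
fast = x * (W @ (x * x))              # same result via the identity
assert torch.allclose(dense, fast, atol=1e-6)
print("max abs diff:", (dense - fast).abs().max().item())
```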
Problem

Research questions and friction points this paper is trying to address.

Modeling dynamically evolving spatio-temporal signals with graph neural networks
Capturing dynamic statistical interdependencies without sliding windows
Handling cases where underlying graph structure may not exist
Innovation

Methods, ideas, or system contributions that make the work stand out.

Signal-dependent connectivity tensor combining stable and instantaneous interactions
Dynamic statistical interdependencies captured without sliding windows
Linear complexity implementation for efficient sequence processing (an end-to-end sketch combining these ideas follows after this list)
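The synergistic GNN–sequence-modeling architecture mentioned in the summary could, for example, feed graph-variate features into a recurrent forecasting head. The pairing below with an LSTM is a hypothetical end-to-end sketch that reuses the GraphVariateLayer class from the first code block; the wiring and hyperparameters are assumptions.

```python
import torch
import torch.nn as nn

class GVNNForecaster(nn.Module):
    """Hypothetical GVNN + LSTM pipeline for one-step-ahead forecasting.
    Reuses GraphVariateLayer from the earlier sketch; wiring is illustrative."""

    def __init__(self, n_nodes: int, hidden: int = 64):
        super().__init__()
        self.gv = GraphVariateLayer(n_nodes)             # graph-variate mixing
        self.rnn = nn.LSTM(n_nodes, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_nodes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.gv(x)                # (B, T, N) graph-mixed features
        seq, _ = self.rnn(h)          # temporal modeling over all time steps
        return self.head(seq[:, -1])  # next-step prediction per channel

# Usage: batch of 4 sequences, 50 time steps, 8 channels -> (4, 8) forecast
model = GVNNForecaster(n_nodes=8)
y_hat = model(torch.randn(4, 50, 8))
```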
Om Roy
Department of Computer and Information Sciences, University of Strathclyde, Glasgow, Scotland, United Kingdom, G1 1XQ
Yashar Moshfeghi
University of Strathclyde
Generative AI, Predictive AI, Brain-Computer Interfaces, NeuraSearch, Information Retrieval
Keith Smith
Department of Computer and Information Sciences, University of Strathclyde, Glasgow, Scotland, United Kingdom, G1 1XQ