🤖 AI Summary
For inference tasks on time series over graphs—such as urban water networks, economic systems, and network neuroscience—this paper proposes a graph-aware state space model (GSSM), a unified framework for jointly modeling graph structure and temporal dynamics. Methodologically: (i) state evolution is formulated via a graph stochastic partial differential equation driven by edge-wise noise, capturing local uncertainty on the graph's edges; (ii) the observation model is a sampled, graph-filtered version of the state, so multi-hop neighborhood structure acts as a physical prior within data-driven learning; (iii) parameters are learned first via maximum likelihood, which is theoretically tractable, and then via a principled end-to-end deep architecture in the spirit of Kalman neural networks that jointly learns the parameters and tracks the state. Under partial observability, GSSM supports downstream forecasting and missing-value imputation while remaining interpretable and parameter-efficient—offering a tractable paradigm for complex spatiotemporal graph modeling.
📝 Abstract
Inference tasks with time series over graphs are important in applications such as urban water networks, economics, and network neuroscience. Addressing these tasks typically relies on identifying a computationally affordable model that jointly captures the graph-temporal patterns of the data. In this work, we propose a graph-aware state space model for graph time series, where both the latent state and the observation equation are parametric graph-induced models with a limited number of parameters to be learned. More specifically, we consider the state equation to follow a stochastic partial differential equation driven by noise over the graph's edges, accounting for potential edge uncertainties while also increasing the degrees of freedom of the state model in a tractable manner. Conditioning the noise dispersion on the graph structure allows the state variable to deviate from the stochastic process in certain neighborhoods. The observation model is a sampled and graph-filtered version of the state, capturing multi-hop neighboring influence. The goal is to learn the parameters of both the state and observation models from the partially observed data for downstream tasks such as prediction and imputation. The model is inferred first through a maximum likelihood approach that provides theoretical tractability but is limited in expressivity and scalability. To improve on both, we use the state-space formulation to build a principled deep learning architecture that jointly learns the parameters and tracks the state in an end-to-end manner, in the spirit of Kalman neural networks.
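To make the model class concrete, here is a minimal NumPy sketch of the generative structure the abstract describes: a state transition given by a low-order graph filter, process noise injected through the graph's edges via an incidence matrix, and observations obtained by sampling a multi-hop graph-filtered version of the state. The specific graph, filter orders, coefficients, and observed-node set are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph (assumption for illustration): a 5-node ring.
N = 5
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0

# Incidence-like matrix B maps edge-wise noise onto the nodes,
# so uncertainty enters the state through the graph's edges.
edges = [(i, j) for i in range(N) for j in range(i + 1, N) if A[i, j]]
B = np.zeros((N, len(edges)))
for k, (i, j) in enumerate(edges):
    B[i, k], B[j, k] = 1.0, -1.0

# State transition as a first-order graph filter F = h0*I + h1*A;
# h0, h1 are the (here hand-picked) parameters one would learn.
h0, h1 = 0.5, 0.2
F = h0 * np.eye(N) + h1 * A

# Observation: a 2-hop graph filter H = g0*I + g1*A + g2*A^2,
# then sampling a node subset S to model partial observability.
g0, g1, g2 = 1.0, 0.3, 0.1
H = g0 * np.eye(N) + g1 * A + g2 * (A @ A)
obs_nodes = [0, 2, 4]
S = np.eye(N)[obs_nodes]

sigma_w, sigma_v = 0.1, 0.05
x = rng.standard_normal(N)
for t in range(10):
    # State update: graph-filtered state plus edge-driven process noise.
    x = F @ x + sigma_w * (B @ rng.standard_normal(len(edges)))
    # Observation: sampled, graph-filtered state plus measurement noise.
    y = S @ (H @ x) + sigma_v * rng.standard_normal(len(obs_nodes))

print(x.shape, y.shape)  # (5,) (3,)
```

Because both F and S @ H are linear in the state, this generative model plugs directly into a standard Kalman filter, whose likelihood can then drive maximum likelihood estimation of the filter coefficients or a learned Kalman-network-style tracker.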