🤖 AI Summary
Modeling time-varying spillover effects, own-lag persistence, and covariate impacts in network-structured time series remains challenging because topology and temporal dynamics are entangled. Method: We propose the Network Time-Varying Parameter Vector Autoregression (NTVP-VAR) model, which represents each lag matrix as a time-varying linear combination of graph operators, thereby decoupling network structure (interaction topology) from temporal evolution (interaction strength). The model uses low-dimensional latent states to govern dynamic node interactions, own-lag persistence, and covariate responses, integrating graph differencing, stochastic coefficient evolution, and tensor low-rank constraints. Contribution/Results: Theoretically, we establish conditions for second-moment existence, network stability, and local stationarity under nonstationary latent states. Methodologically, NTVP-VAR unifies Gaussian/Poisson network VARs, graph-differenced ARIMA, and dynamic edge models. Empirically, it achieves superior predictive accuracy and interpretability on high-dimensional sparse networks.
📝 Abstract
Many modern time series arise on networks, where each component is attached to a node and interactions follow observed edges. Classical time-varying parameter VARs (TVP-VARs) treat all series symmetrically and ignore this structure, while network autoregressive models exploit a given graph but usually impose constant parameters and stationarity. We develop network state-space models in which a low-dimensional latent state controls time-varying network spillovers, own-lag persistence and nodal covariate effects. A key special case is a network time-varying parameter VAR (NTVP-VAR) that constrains each lag matrix to be a linear combination of known network operators, such as a row-normalised adjacency and the identity, and lets the associated coefficients evolve stochastically in time. The framework nests Gaussian and Poisson network autoregressions, network ARIMA models with graph differencing, and dynamic edge models driven by multivariate logistic regression. We give conditions ensuring that NTVP-VARs are well-defined in second moments despite nonstationary states, describe network versions of stability and local stationarity, and discuss shrinkage, thresholding and low-rank tensor structures for high-dimensional graphs. Conceptually, network state-space models separate where interactions may occur (the graph) from how strong they are at each time (the latent state), providing an interpretable alternative to both unstructured TVP-VARs and existing network time-series models.
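To make the core construction concrete, the following is a minimal simulation sketch of an NTVP-VAR(1), not the paper's implementation: each lag matrix is a linear combination of two known network operators (a row-normalised adjacency W for spillovers and the identity I for own-lag persistence), with coefficients that evolve as a random walk over time. The network, initial coefficients, and clipping bounds are illustrative assumptions; the paper's actual stability conditions are more general than the crude clipping used here.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical small network: 5 nodes, random directed adjacency ---
n = 5
adj = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(adj, 0.0)

# Two known network operators: row-normalised adjacency W and identity I
row_sums = adj.sum(axis=1, keepdims=True)
W = np.divide(adj, row_sums, out=np.zeros_like(adj), where=row_sums > 0)
I = np.eye(n)

# --- Latent state: operator coefficients theta_t evolve stochastically ---
T = 200
theta = np.zeros((T, 2))
theta[0] = [0.3, 0.4]  # initial spillover / own-lag strengths (assumed)
for t in range(1, T):
    step = theta[t - 1] + 0.01 * rng.standard_normal(2)
    # Crude stability safeguard: |theta_1| + |theta_2| <= 0.9 keeps the
    # time-varying lag matrix a contraction (W has row sums <= 1).
    theta[t] = np.clip(step, -0.45, 0.45)

# --- Simulate y_t = (theta_{t,1} W + theta_{t,2} I) y_{t-1} + eps_t ---
y = np.zeros((T, n))
for t in range(1, T):
    A_t = theta[t, 0] * W + theta[t, 1] * I  # time-varying lag matrix
    y[t] = A_t @ y[t - 1] + rng.standard_normal(n)

print(y.shape)
```

The graph fixes *where* interactions may occur (the sparsity pattern of W), while the latent state `theta` controls *how strong* they are at each time, which is the separation the abstract emphasises.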