Reduced-Rank Autoregressive Model for High-Dimensional Multivariate Network Time Series

📅 2026-01-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses a limitation of existing network autoregressive models, which treat nodes as scalar processes and thus fail to capture cross-variable interactions in high-dimensional multivariate network time series. To overcome this, the authors propose the Reduced-Rank Network Autoregressive (RRNAR) model, which integrates a known network topology with a learnable low-rank latent subspace through a separable bilinear transition structure. This design captures cross-dimensional propagation mechanisms while avoiding the curse of dimensionality. The theoretical analysis reveals a "blessing of dimensionality": under sparse network structures, estimation accuracy improves as the network size grows, and non-asymptotic error bounds are established under a novel distance metric. Coupled with a tailored Scaled Gradient Descent (ScaledGD) algorithm, RRNAR significantly outperforms baseline methods on real-world traffic and server monitoring datasets, successfully uncovering latent cross-channel propagation pathways.
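The summary names a separable bilinear transition that couples a known network matrix with a learnable low-rank variable subspace, but does not spell out the parameterization. A minimal simulation sketch of one plausible form, X_t = ρ · W · X_{t-1} · Bᵀ + E_t with B = U Vᵀ of rank r, where W, ρ, U, V, and all sizes are illustrative assumptions rather than the paper's actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r, T = 20, 6, 2, 200  # nodes, variables per node, latent rank, series length

# Known sparse network: a hypothetical random graph, row-normalized
A = (rng.random((n, n)) < 0.1).astype(float)
np.fill_diagonal(A, 0)
W = A / np.maximum(A.sum(axis=1, keepdims=True), 1)

# Learnable low-rank variable transition B = U V^T (rank r << d)
U = 0.3 * rng.standard_normal((d, r))
V = 0.3 * rng.standard_normal((d, r))
B = U @ V.T

def step(X, rho=0.4):
    """One separable bilinear transition: W mixes across nodes (network
    propagation), B mixes across variables (cross-channel spillover)."""
    return rho * W @ X @ B.T + 0.1 * rng.standard_normal(X.shape)

X = rng.standard_normal((n, d))
series = [X]
for _ in range(T - 1):
    series.append(step(series[-1]))
series = np.stack(series)  # shape (T, n, d): time x nodes x variables
print(series.shape)
```

The separability is what keeps the parameter count at O(n²) for the (given) network plus O(dr) for the factors, instead of O(n²d²) for an unrestricted VAR on the stacked state.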

📝 Abstract
Multivariate network time series are ubiquitous in modern systems, yet existing network autoregressive models typically treat nodes as scalar processes, ignoring cross-variable spillovers. To capture these complex interactions without the curse of dimensionality, we propose the Reduced-Rank Network Autoregressive (RRNAR) model. Our framework introduces a separable bilinear transition structure that couples the known network topology with a learnable low-rank variable subspace. We estimate the model using a novel Scaled Gradient Descent (ScaledGD) algorithm, explicitly designed to bridge the gap between rigid network scalars and flexible factor components. Theoretically, we establish non-asymptotic error bounds under a novel distance metric. A key finding is a network-induced blessing of dimensionality: for sparse networks, the estimation accuracy for network parameters improves as the network size grows. Applications to traffic and server monitoring networks demonstrate that RRNAR significantly outperforms univariate and unstructured benchmarks by identifying latent cross-channel propagation mechanisms.
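The abstract's estimator is a Scaled Gradient Descent (ScaledGD) routine for the low-rank component. The paper's tailored variant is not reproduced here; the sketch below shows only the generic ScaledGD update for an asymmetric rank-r factorization, in which each factor's gradient is preconditioned by the other factor's inverse Gram matrix. The target matrix, step size, and initialization are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
d, r = 8, 2

# Hypothetical rank-r target playing the role of the variable-transition factor
B = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))

# Spectral initialization, slightly perturbed so the iterations have work to do
Uu, s, Vt = np.linalg.svd(B)
U = Uu[:, :r] * np.sqrt(s[:r]) + 0.05 * rng.standard_normal((d, r))
V = Vt[:r].T * np.sqrt(s[:r]) + 0.05 * rng.standard_normal((d, r))

eta = 0.5
for _ in range(100):
    R = U @ V.T - B  # residual of f(U, V) = ||U V^T - B||_F^2 / 2
    # ScaledGD: precondition each gradient by the other factor's Gram inverse,
    # making the iteration invariant to the scale split between U and V
    U_new = U - eta * (R @ V) @ np.linalg.inv(V.T @ V)
    V_new = V - eta * (R.T @ U) @ np.linalg.inv(U.T @ U)
    U, V = U_new, V_new  # simultaneous, not alternating, update

err = np.linalg.norm(U @ V.T - B)
print(f"final Frobenius error: {err:.2e}")
```

The preconditioning is the point: plain gradient descent on (U, V) slows down when the factors are ill-conditioned, while the Gram-inverse scaling gives a convergence rate independent of the condition number of the low-rank target.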
Problem

Research questions and friction points this paper is trying to address.

multivariate network time series
cross-variable spillovers
high-dimensional
network autoregressive model
curse of dimensionality
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reduced-Rank Network Autoregressive
Scaled Gradient Descent
bilinear transition structure
network-induced blessing of dimensionality
multivariate network time series