Sequential Reservoir Computing for Efficient High-Dimensional Spatiotemporal Forecasting

📅 2026-01-01
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of traditional RNNs/LSTMs and conventional reservoir computing for predicting high-dimensional spatiotemporal systems: the former suffer from gradient-related training difficulties and memory bottlenecks, while the latter scale poorly with increasing input dimensionality. The authors propose a sequential reservoir computing architecture that, for the first time, decomposes a large reservoir into multiple interconnected smaller sub-reservoirs. By cascading fixed random recurrent layers and training only a convex optimization-based readout, the approach eliminates backpropagation and substantially reduces computational and memory costs. While maintaining model simplicity, the method significantly improves scalability and long-term dependency modeling for high-dimensional dynamical systems. Experiments on Lorenz63, 2D vorticity, and shallow-water equations demonstrate 15–25% longer valid prediction horizons, 20–30% lower error metrics (SSIM, RMSE), and training costs reduced by three orders of magnitude compared to RNN/LSTM baselines.

📝 Abstract
Forecasting high-dimensional spatiotemporal systems remains computationally challenging for recurrent neural networks (RNNs) and long short-term memory (LSTM) models due to gradient-based training and memory bottlenecks. Reservoir Computing (RC) mitigates these challenges by replacing backpropagation with fixed recurrent layers and a convex readout optimization, yet conventional RC architectures still scale poorly with input dimensionality. We introduce a Sequential Reservoir Computing (Sequential RC) architecture that decomposes a large reservoir into a series of smaller, interconnected reservoirs. This design reduces memory and computational costs while preserving long-term temporal dependencies. Using both low-dimensional chaotic systems (Lorenz63) and high-dimensional physical simulations (2D vorticity and shallow-water equations), Sequential RC achieves 15-25% longer valid forecast horizons, 20-30% lower error metrics (SSIM, RMSE), and up to three orders of magnitude lower training cost compared to LSTM and standard RNN baselines. The results demonstrate that Sequential RC maintains the simplicity and efficiency of conventional RC while achieving superior scalability for high-dimensional dynamical systems. This approach provides a practical path toward real-time, energy-efficient forecasting in scientific and engineering applications.
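The cascaded design described in the abstract can be illustrated with a minimal NumPy sketch: a chain of small leaky-tanh echo-state sub-reservoirs, each driven by the input and by the previous sub-reservoir's state, with a single ridge-regression readout over the concatenated states (no backpropagation). The sub-reservoir sizes, leak rate, coupling matrices, and Lorenz63 setup below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorenz63(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8/3):
    """Integrate the Lorenz63 system with forward Euler (toy data source)."""
    x = np.empty((n_steps, 3))
    x[0] = [1.0, 1.0, 1.0]
    for t in range(1, n_steps):
        px, py, pz = x[t - 1]
        x[t] = x[t - 1] + dt * np.array([
            sigma * (py - px),
            px * (rho - pz) - py,
            px * py - beta * pz,
        ])
    return x

class SequentialRC:
    """Chain of small echo-state sub-reservoirs. Each sub-reservoir sees the
    input plus the previous sub-reservoir's state; only the linear readout
    over the concatenated states is trained (ridge regression)."""

    def __init__(self, dim_in, n_sub=4, size=100, spectral_radius=0.9,
                 leak=0.3, ridge=1e-6):
        self.n_sub, self.size, self.leak, self.ridge = n_sub, size, leak, ridge
        self.W_in, self.W, self.W_link = [], [], []
        for _ in range(n_sub):
            self.W_in.append(rng.uniform(-0.5, 0.5, (size, dim_in)))
            W = rng.uniform(-0.5, 0.5, (size, size))
            W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
            self.W.append(W)
            # coupling from the previous sub-reservoir (unused for the first)
            self.W_link.append(rng.uniform(-0.5, 0.5, (size, size)))

    def _states(self, U):
        s = [np.zeros(self.size) for _ in range(self.n_sub)]
        S = np.empty((len(U), self.n_sub * self.size))
        for t, u in enumerate(U):
            prev = np.zeros(self.size)
            for k in range(self.n_sub):
                pre = self.W_in[k] @ u + self.W[k] @ s[k] + self.W_link[k] @ prev
                s[k] = (1 - self.leak) * s[k] + self.leak * np.tanh(pre)
                prev = s[k]
            S[t] = np.concatenate(s)
        return S

    def fit(self, U, Y, washout=100):
        # Convex readout: ridge regression on post-washout reservoir states.
        S, Yw = self._states(U)[washout:], Y[washout:]
        A = S.T @ S + self.ridge * np.eye(S.shape[1])
        self.W_out = np.linalg.solve(A, S.T @ Yw)
        return self

    def predict(self, U):
        return self._states(U) @ self.W_out

# One-step-ahead forecasting on Lorenz63.
data = lorenz63(3000)
U, Y = data[:-1], data[1:]
rc = SequentialRC(dim_in=3).fit(U[:2000], Y[:2000])
pred = rc.predict(U[2000:])
# Skip the first 100 test steps so the fresh reservoir state washes in.
rmse = np.sqrt(np.mean((pred[100:] - Y[2100:]) ** 2))
```

Because the recurrent weights stay fixed, training reduces to one linear solve over the concatenated states, which is where the memory and compute savings over backpropagation-through-time come from.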
Problem

Research questions and friction points this paper is trying to address.

spatiotemporal forecasting
high-dimensional systems
reservoir computing
computational scalability
memory bottleneck
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sequential Reservoir Computing
Spatiotemporal Forecasting
High-Dimensional Dynamical Systems
Computational Efficiency
Reservoir Decomposition
A. A. Asanjan
USRA Research Institute for Advanced Computer Science (RIACS), California
Filip A. Wudarski
Scientist at USRA in NASA QuAIL group
Daniel O'Connor
Standard Chartered Bank, 1 Basinghall Avenue, London, UK
Shaun Geaney
Standard Chartered Bank, 1 Basinghall Avenue, London, UK
Elena Strbac
Standard Chartered Bank, 1 Basinghall Avenue, London, UK
P. A. Lott
USRA Research Institute for Advanced Computer Science (RIACS), California
Davide Venturelli
USRA Research Institute for Advanced Computer Science (RIACS), California