DSSRNN: Decomposition-Enhanced State-Space Recurrent Neural Network for Time-Series Analysis

📅 2024-12-01
🏛️ arXiv.org
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Addressing the longstanding trade-off between accuracy and efficiency in time-series forecasting, this paper proposes a framework that integrates interpretable decomposition, physics-guided state-space modeling, and a lightweight RNN. It unifies STL decomposition, physics-informed constraints, and state-space RNNs to explicitly model trend, seasonality, and CO₂ diffusion dynamics governed by physical laws. Inference is optimized through a MACs-aware design that substantially reduces computational overhead. Evaluated on an indoor air quality dataset, the method achieves MSE = 0.378 and MAE = 0.401 for 96-step-ahead forecasting (Office 1), outperforming state-of-the-art Transformer-based models. With only 0.58 ms inference latency, a 437 MiB memory footprint, and 0.11 G MACs, it delivers high prediction accuracy, strong interpretability, and low deployment cost, enabling efficient edge deployment without sacrificing fidelity.

๐Ÿ“ Abstract
Time series forecasting is a crucial yet challenging task in machine learning, requiring domain-specific knowledge due to its wide-ranging applications. While recent Transformer models have improved forecasting capabilities, they come with high computational costs. Linear-based models have shown better accuracy than Transformers but still fall short of ideal performance. To address these challenges, we introduce the Decomposition State-Space Recurrent Neural Network (DSSRNN), a novel framework designed for both long-term and short-term time series forecasting. DSSRNN uniquely combines decomposition analysis to capture seasonal and trend components with state-space models and physics-based equations. We evaluate DSSRNN's performance on indoor air quality datasets, focusing on CO₂ concentration prediction across various forecasting horizons. Results demonstrate that DSSRNN consistently outperforms state-of-the-art models, including Transformer-based architectures, in terms of both Mean Squared Error (MSE) and Mean Absolute Error (MAE). For example, at the shortest horizon (T=96) in Office 1, DSSRNN achieved an MSE of 0.378 and an MAE of 0.401, significantly lower than competing models. Additionally, DSSRNN exhibits superior computational efficiency compared to more complex models. While not as lightweight as the DLinear model, DSSRNN achieves a balance between performance and efficiency, with only 0.11 G MACs and 437 MiB memory usage, and an inference time of 0.58 ms for long-term forecasting. This work not only showcases DSSRNN's success but also establishes a new benchmark for physics-informed machine learning in environmental forecasting and potentially other domains.
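The seasonal-trend split the abstract describes can be sketched with a simple centered moving average (the scheme popularized by DLinear, which the abstract compares against); the paper's own decomposition may differ in detail, so the kernel size and edge padding below are illustrative assumptions:

```python
import numpy as np

def decompose(series: np.ndarray, kernel_size: int = 25):
    """Split a 1-D series into trend and seasonal (remainder) parts.

    Trend is a centered moving average with edge padding so the output
    keeps the input length (kernel_size should be odd); the seasonal
    component is the residual. Illustrative stand-in, not the paper's
    exact decomposition.
    """
    half = kernel_size // 2
    padded = np.concatenate([
        np.repeat(series[0], half),   # pad front with the first value
        series,
        np.repeat(series[-1], half),  # pad back with the last value
    ])
    trend = np.convolve(padded, np.ones(kernel_size) / kernel_size, mode="valid")
    seasonal = series - trend
    return trend, seasonal
```

By construction the two components sum back to the original series, so the model can forecast each part separately and add the forecasts.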
Problem

Research questions and friction points this paper is trying to address.

Balancing accuracy and efficiency in time series forecasting
Overcoming high computational costs of Transformer models
Improving performance in CO₂ concentration prediction tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Physics-informed adaptive decomposition in RNN
Separates seasonal and trend components
Embeds domain equations in recurrent framework
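One way to read "embeds domain equations in a recurrent framework" is as a physics-informed penalty: predictions are scored both against the data and against a well-mixed CO₂ mass-balance ODE. The ODE form, the ventilation parameters, and the loss weighting below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def physics_residual(c_pred: np.ndarray, dt: float, air_change_rate: float,
                     c_outdoor: float, source_per_volume: float) -> np.ndarray:
    """Residual of an assumed well-mixed CO2 mass-balance ODE:

        dC/dt = air_change_rate * (c_outdoor - C) + source_per_volume

    A perfect physically consistent prediction has residual zero.
    """
    dc_dt = np.gradient(c_pred, dt)  # finite-difference time derivative
    return dc_dt - (air_change_rate * (c_outdoor - c_pred) + source_per_volume)

def physics_informed_loss(c_pred, c_true, dt, lam=0.1, **phys):
    """Data-fit MSE plus a weighted penalty on the physics residual."""
    mse = np.mean((c_pred - c_true) ** 2)
    res = physics_residual(c_pred, dt, **phys)
    return mse + lam * np.mean(res ** 2)
```

During training, the penalty term pushes the network toward trajectories consistent with the diffusion/ventilation dynamics, even where observations are sparse or noisy.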
Ahmad Mohammadshirazi
The Ohio State University
Time Series Analysis · LLMs · VLMs · GenAI
Ali Nosrati Firoozsalari
Shahid Beheshti University
R. Ramnath
The Ohio State University