SpikySpace: A Spiking State Space Model for Energy-Efficient Time Series Forecasting

📅 2026-01-02
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
This work targets energy-efficient time-series forecasting on edge devices with spiking neural networks, where existing approaches relying on complex Transformer architectures fall short. To this end, we propose SpikySpace—the first fully spiking state space model—which reduces the quadratic cost of attention to linear complexity through selective scanning and performs sparse updates exclusively on spike events, thereby eliminating dense matrix multiplications while preserving structured memory capabilities. The model further incorporates Softplus and SiLU approximations amenable to neuromorphic hardware, bridging efficient spiking computation with modern sequence modeling. Experiments demonstrate that SpikySpace achieves competitive prediction accuracy while reducing energy consumption by 98.73% and 96.24% compared to iTransformer and iSpikformer, respectively, and substantially lowering memory-access overhead.
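The event-driven state update described in the summary can be illustrated with a minimal sketch: a linear-time recurrent scan that computes the candidate state h_t = A·h_{t-1} + B·x_t but commits it only where a spike occurs, so non-spiking channels skip the dense update. The scalar parameters `A`, `B`, `C` and the function shape below are assumed placeholders for illustration, not the paper's exact recurrence.

```python
def spike_gated_scan(spikes, xs, A=0.5, B=1.0, C=1.0):
    """Linear-time, event-driven SSM scan (hypothetical sketch).

    spikes: per-step binary gates, one per channel.
    xs:     per-step input values, one per channel.
    The recurrence h_t = A*h_{t-1} + B*x_t is applied only on
    spike events; otherwise the state is carried over unchanged.
    """
    h = [0.0] * len(xs[0])                      # initial state per channel
    ys = []
    for s_t, x_t in zip(spikes, xs):
        h = [A * hj + B * xj if sj else hj      # update only where s_t fires
             for hj, xj, sj in zip(h, x_t, s_t)]
        ys.append([C * hj for hj in h])         # linear readout per step
    return ys

# Example: 3 steps, 2 channels, constant input of 1.0
# → [[1.0, 0.0], [1.5, 1.0], [1.5, 1.5]]
out = spike_gated_scan([[1, 0], [1, 1], [0, 1]], [[1.0, 1.0]] * 3)
```

Because each step touches each channel once, the scan is O(T·d) overall, in contrast to the O(T²) pairwise interactions of self-attention.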

📝 Abstract
Time-series forecasting in domains like traffic management and industrial monitoring often requires real-time, energy-efficient processing on edge devices with limited resources. Spiking neural networks (SNNs) offer event-driven computation and ultra-low power consumption and have been proposed for use in this space. Unfortunately, existing SNN-based time-series forecasters often rely on complex Transformer blocks. To address this issue, we propose SpikySpace, a spiking state-space model (SSM) that reduces the quadratic cost of the attention block to linear time via spiking selective scanning. Further, we introduce PTsoftplus and PTSiLU, two efficient approximations of Softplus and SiLU that replace costly exponential and division operations with simple bit-shifts. Evaluated on four multivariate time-series benchmarks, SpikySpace outperforms the leading SNN baseline in accuracy by up to 3.0% while reducing energy consumption by over 96.1%. As the first fully spiking state-space model, SpikySpace bridges neuromorphic efficiency with modern sequence modeling, opening a practical path toward efficient time-series forecasting systems. Our code is available at https://anonymous.4open.science/r/SpikySpace.
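The shift-based activation idea can be sketched in fixed-point arithmetic, where divisions by powers of two become arithmetic shifts. The Q8 format, the hard-sigmoid form, and the piecewise-linear Softplus breakpoints below are assumptions for illustration; they are not the paper's actual PTsoftplus/PTSiLU definitions.

```python
Q = 8             # fractional bits (Q8 fixed point)
ONE = 1 << Q      # 1.0 in Q8

def hard_sigmoid_q(x_q):
    # sigmoid(x) ≈ clamp(x/4 + 0.5, 0, 1); x/4 is an arithmetic shift
    y = (x_q >> 2) + (ONE >> 1)
    return max(0, min(ONE, y))

def pt_silu_q(x_q):
    # SiLU(x) = x * sigmoid(x); one multiply plus a renormalising shift
    return (x_q * hard_sigmoid_q(x_q)) >> Q

def pt_softplus_q(x_q):
    # softplus(x) ≈ 0 for x ≤ -2, x for x ≥ 2, (x + 2)/2 in between;
    # the middle ramp is an add and a shift, no exp or division
    if x_q <= -2 * ONE:
        return 0
    if x_q >= 2 * ONE:
        return x_q
    return (x_q + 2 * ONE) >> 1
```

For example, `pt_silu_q(ONE)` (i.e. x = 1.0) yields 192/256 = 0.75 against the exact SiLU(1) ≈ 0.731, using only shifts, adds, and a single integer multiply.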
Problem

Research questions and friction points this paper is trying to address.

time series forecasting
energy efficiency
spiking neural networks
edge computing
neuromorphic computing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spiking Neural Networks
State Space Model
Energy-Efficient Forecasting
Neuromorphic Computing
Selective Scanning