Delays in Spiking Neural Networks: A State Space Model Approach

📅 2025-12-01
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
How can finite temporal memory be efficiently modeled in spiking neural networks (SNNs)? This work proposes a general state-space modeling paradigm that explicitly encodes input history via learnable auxiliary state variables, enabling neurons to dynamically access temporally localized information within a finite time window without increasing event-driven computational overhead. The framework is orthogonal to standard neuron models (e.g., leaky integrate-and-fire) and supports end-to-end learning of both the delay duration and the associated parameters. Evaluated on the Spiking Heidelberg Digits (SHD) benchmark, the method matches the performance of existing delay-based SNNs while remaining computationally efficient. Notably, it yields substantial accuracy gains for compact network architectures, demonstrating its effectiveness and generalizability in enhancing temporal modeling capability while preserving computational efficiency.

📝 Abstract
Spiking neural networks (SNNs) are biologically inspired, event-driven models that are suitable for processing temporal data and offer energy-efficient computation when implemented on neuromorphic hardware. In SNNs, richer neuronal dynamics allow capturing more complex temporal dependencies, with delays playing a crucial role by allowing past inputs to directly influence present spiking behavior. We propose a general framework for incorporating delays into SNNs through additional state variables. The proposed mechanism enables each neuron to access a finite temporal input history. The framework is agnostic to the neuron model and hence can be seamlessly integrated into standard spiking neuron models such as LIF and adLIF. We analyze how the duration of the delays and the learnable parameters associated with them affect performance. We investigate the trade-offs in the network architecture due to the additional state variables introduced by the delay mechanism. Experiments on the Spiking Heidelberg Digits (SHD) dataset show that the proposed mechanism matches the performance of existing delay-based SNNs while remaining computationally efficient. Moreover, the results illustrate that the incorporation of delays may substantially improve performance in smaller networks.
Problem

Research questions and friction points this paper is trying to address.

Incorporating delays into spiking neural networks to capture temporal dependencies.
Analyzing how delay parameters and network architecture affect performance.
Improving efficiency and performance in smaller networks using delays.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporating delays via additional state variables
Enabling neurons to access finite temporal input history
Seamlessly integrating delays into standard spiking neuron models
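The mechanism listed above can be sketched as a delay line of auxiliary state variables attached to a standard LIF neuron: the neuron's input current becomes a learnable combination of the last K inputs, giving it access to a finite temporal window. This is a minimal illustrative sketch, not the paper's implementation; the class, parameter names, and the exact update rule (shift-register buffer, hard reset) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class DelayedLIF:
    """LIF neuron layer with a learnable finite delay line (illustrative sketch).

    Each neuron keeps K auxiliary state variables holding its last K input
    currents (a shift register). A learnable weight vector over this buffer
    lets the neuron draw on delayed inputs, so past inputs directly influence
    present spiking behavior within a finite time window.
    """

    def __init__(self, n_neurons, n_delays, tau=10.0, v_th=1.0):
        self.buffer = np.zeros((n_delays, n_neurons))    # auxiliary delay states
        self.delay_w = rng.normal(0.0, 0.5, n_delays)    # learnable delay weights
        self.v = np.zeros(n_neurons)                     # membrane potential
        self.alpha = np.exp(-1.0 / tau)                  # leak factor per step
        self.v_th = v_th                                 # spike threshold

    def step(self, x):
        # Shift the delay line: oldest input drops out, newest enters slot 0.
        self.buffer = np.roll(self.buffer, 1, axis=0)
        self.buffer[0] = x
        # Effective input current: learned combination of the delayed inputs.
        i_t = self.delay_w @ self.buffer
        # Standard LIF membrane update with hard reset on spike.
        self.v = self.alpha * self.v + i_t
        spikes = (self.v >= self.v_th).astype(float)
        self.v = self.v * (1.0 - spikes)
        return spikes
```

Because the buffer update is a fixed shift plus the learned readout, the delay mechanism adds only K extra state variables per neuron and no extra event-driven operations, which is the trade-off the paper analyzes for smaller networks.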
Sanja Karilanova
Department of Electrical Engineering, Uppsala University, Sweden

Subhrakanti Dey
Professor, IEEE Fellow, Uppsala University, Sweden
distributed optimization and learning · networked control systems · cyber-physical security

Ayça Özçelikkale
Department of Electrical Engineering, Uppsala University, Sweden