ASRC-SNN: Adaptive Skip Recurrent Connection Spiking Neural Network

📅 2025-05-16
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the gradient vanishing problem in recurrent spiking neural networks (RSNNs) arising from the decoupled analysis of neuronal dynamics and recurrent connectivity, which severely impairs long-term temporal modeling. We systematically characterize, for the first time, the underlying gradient decay mechanism across timesteps. To overcome this limitation, we propose a unified dynamical systems perspective that jointly models neurons and recurrent connections, and introduce the skip-recurrent connection (SRC) architecture—along with its adaptive variant (ASRC)—which learns optimal skip distances to break fixed-span constraints. By integrating gradient flow modeling with adaptive span optimization, our approach significantly enhances temporal representation capability and robustness. On multiple standard sequential benchmarks, the proposed model achieves over 12% higher accuracy than baseline RSNNs and SRC-SNNs, and improves noise robustness by 35%.
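The core idea of the skip-recurrent connection described above — feeding back spikes from `skip` steps earlier rather than from the immediately preceding step, so gradients traverse fewer multiplicative hops over long horizons — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the LIF dynamics, reset rule, and all names (`lif_src_layer`, `tau`, `v_th`) are assumptions.

```python
import numpy as np

def lif_src_layer(x_seq, w_in, w_rec, skip=4, tau=2.0, v_th=1.0):
    """Sketch of one recurrent spiking layer with a skip-recurrent
    connection (SRC): the recurrent input at step t comes from the
    spikes emitted `skip` steps earlier, s[t - skip], instead of
    s[t - 1]. LIF formulation and parameters are illustrative."""
    T, _ = x_seq.shape
    n = w_rec.shape[0]
    v = np.zeros(n)                       # membrane potential
    spikes = np.zeros((T, n))
    for t in range(T):
        # skip-recurrent feedback: delayed by `skip` steps
        rec = spikes[t - skip] @ w_rec if t >= skip else np.zeros(n)
        v = v * (1.0 - 1.0 / tau) + x_seq[t] @ w_in + rec  # leaky integration
        s = (v >= v_th).astype(float)     # threshold firing
        v = v * (1.0 - s)                 # hard reset on spike
        spikes[t] = s
    return spikes
```

With `skip=1` this reduces to a vanilla RSNN layer; a larger span shortens the multiplicative gradient path across the sequence, which is the mechanism the summary credits for mitigating gradient vanishing.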

📝 Abstract
In recent years, Recurrent Spiking Neural Networks (RSNNs) have shown promising potential in long-term temporal modeling. Many studies focus on improving neuron models and also integrate recurrent structures, leveraging their synergistic effects to improve the long-term temporal modeling capability of Spiking Neural Networks (SNNs). However, these studies often place excessive emphasis on the role of neurons, overlooking the importance of analyzing neurons and recurrent structures as an integrated framework. In this work, we treat neurons and recurrent structures as an integrated system and conduct a systematic analysis of gradient propagation along the temporal dimension, revealing a challenging gradient vanishing problem. To address this issue, we propose the Skip Recurrent Connection (SRC) as a replacement for the vanilla recurrent structure, effectively mitigating the gradient vanishing problem and enhancing long-term temporal modeling performance. Additionally, we propose the Adaptive Skip Recurrent Connection (ASRC), which can learn the skip span of the skip recurrent connection in each layer of the network. Experiments show that replacing the vanilla recurrent structure in RSNNs with SRC significantly improves performance on temporal benchmark datasets, and that ASRC-SNN outperforms SRC-SNN in both temporal modeling capability and robustness.
Problem

Research questions and friction points this paper is trying to address.

Address gradient vanishing in recurrent spiking neural networks
Propose adaptive skip connections for long-term temporal modeling
Enhance RSNN performance and robustness via integrated neuron-structure analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proposes Skip Recurrent Connection (SRC) to mitigate gradient vanishing
Introduces Adaptive Skip Recurrent Connection (ASRC)
Enhances long-term temporal modeling in SNNs
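The adaptive variant listed above learns the skip span per layer rather than fixing it by hand. One plausible way to make a discrete span learnable is to mix feedback from several candidate spans with softmax weights over learnable logits; this mixing scheme is an assumption for illustration (the source only states that ASRC learns each layer's skip span), and the names `adaptive_skip_input` and `alpha` are hypothetical.

```python
import numpy as np

def adaptive_skip_input(spike_hist, t, alpha, w_rec):
    """Illustrative sketch of an adaptive skip-recurrent input:
    softmax weights over learnable logits `alpha` mix recurrent
    feedback from candidate skip spans 1..K, so the effective span
    of each layer can be learned end-to-end."""
    spans = np.arange(1, len(alpha) + 1)      # candidate skip spans 1..K
    p = np.exp(alpha - alpha.max())
    p /= p.sum()                              # softmax over candidate spans
    mixed = np.zeros(w_rec.shape[0])
    for w, s in zip(p, spans):
        if t - s >= 0:                        # skip spans reaching before t=0
            mixed += w * spike_hist[t - s]    # weighted delayed feedback
    return mixed @ w_rec
```

As training sharpens the logits, the weight mass concentrates on one span, recovering a plain SRC layer with a learned skip distance.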
Shang Xu
College of Computer Science, Zhejiang University
Jiayu Zhang
College of Computer Science, Zhejiang University
Ziming Wang
College of Computer Science, Zhejiang University
Runhao Jiang
Zhejiang University
Neuromorphic Computing · Spiking Neural Networks · Deep Learning
Rui Yan
College of Computer Science, Zhejiang University of Technology
Huajin Tang
Zhejiang University, China
Brain-inspired AI · Neurorobotics · Spiking Neural Networks · Brain-inspired Computing