Towards Accurate Binary Spiking Neural Networks: Learning with Adaptive Gradient Modulation Mechanism

📅 2025-02-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Binary spiking neural networks (BSNNs) suffer from slow convergence and low accuracy due to frequent weight sign flips—caused by binary weight constraints and non-differentiable spiking dynamics—disrupting optimization stability. Method: We propose the Adaptive Gradient Modulation Mechanism (AGMM), the first approach to theoretically characterize the intrinsic dynamical origin of sign flips in BSNNs. AGMM jointly regulates gradient scaling and adaptive learning rates to dynamically suppress gradient shocks, enhancing optimization robustness. It integrates spiking neural dynamics modeling with binary-constrained optimization. Results: On both static and neuromorphic benchmarks, AGMM achieves state-of-the-art accuracy for BSNNs, accelerates convergence by over 30%, reduces model size to 1/32 that of full-precision SNNs, and cuts inference energy consumption by more than 80%.

📝 Abstract
Binary Spiking Neural Networks (BSNNs) inherit the event-driven paradigm of SNNs while also benefiting from the reduced storage burden of binarization techniques. These distinct advantages grant BSNNs lightweight and energy-efficient characteristics, rendering them ideal for deployment on resource-constrained edge devices. However, due to the binary synaptic weights and the non-differentiable spike function, effectively training BSNNs remains an open question. In this paper, we conduct an in-depth analysis of the central challenge in BSNN learning: the frequent weight sign flipping problem. To mitigate this issue, we propose an Adaptive Gradient Modulation Mechanism (AGMM), which reduces the frequency of weight sign flipping by adaptively adjusting gradients during learning. AGMM enables BSNNs to achieve faster convergence and higher accuracy, effectively narrowing the gap between BSNNs and their full-precision equivalents. We validate AGMM on both static and neuromorphic datasets, where it achieves state-of-the-art results among BSNNs. This work substantially reduces storage demands and enhances SNNs' inherent energy efficiency, making them highly feasible for resource-constrained environments.
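The core idea described above can be illustrated with a minimal sketch. Note this is a hypothetical reconstruction, not the paper's actual AGMM formulation: binary weights are obtained by taking the sign of latent full-precision weights, and gradient components whose update would flip a latent weight's sign are damped by an assumed hyperparameter, reducing flip frequency. The function names `binarize` and `agmm_update` and the `damping` parameter are illustrative inventions.

```python
import numpy as np

def binarize(w):
    """Forward-pass binarization to +1/-1 (a straight-through
    estimator is assumed for the backward pass)."""
    return np.where(w >= 0, 1.0, -1.0)

def agmm_update(w, grad, lr=0.1, damping=0.1):
    """Hypothetical adaptive gradient modulation step.

    Components of the update that would push a latent weight across
    zero (i.e., flip its binarized sign) are scaled by `damping`;
    all other components pass through unchanged.
    """
    proposed = w - lr * grad                       # naive SGD step
    would_flip = np.sign(proposed) != np.sign(w)   # sign-flip mask
    scale = np.where(would_flip, damping, 1.0)     # damp risky components
    return w - lr * scale * grad

# Example: the first latent weight (0.05) would flip sign under a
# plain SGD step, but the damped update keeps it on the same side.
w = np.array([0.05, -0.8, 0.3])
g = np.array([2.0, -0.1, 0.2])
w_new = agmm_update(w, g)
```

The design choice sketched here is that sign stability is enforced softly, by shrinking rather than zeroing risky gradient components, so latent weights can still flip when the gradient signal persists across steps.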
Problem

Research questions and friction points this paper is trying to address.

BSNNs suffer from frequent weight sign flipping during training
Effectively training BSNNs remains an open question
Frequent sign flips must be suppressed to close the accuracy gap with full-precision SNNs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive Gradient Modulation Mechanism
Binary Spiking Neural Networks
Reduced weight sign flipping
Yu Liang
University of Electronic Science and Technology of China
Wenjie Wei
University of Electronic Science and Technology of China
Spiking Neural Network · Neuromorphic Computing · Model Compression · Event-based Vision
A. Belatreche
Northumbria University
Honglin Cao
University of Electronic Science and Technology of China
Spiking Neuron Network · Model Compression
Zijian Zhou
University of Electronic Science and Technology of China
Shuai Wang
University of Electronic Science and Technology of China
Malu Zhang
University of Electronic Science and Technology of China
Yang Yang
University of Electronic Science and Technology of China