Integer Binary-Range Alignment Neuron for Spiking Neural Networks

📅 2025-06-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Spiking Neural Networks (SNNs) suffer from limited representational capacity due to constrained spiking neuron dynamics, leading to substantial performance gaps versus Artificial Neural Networks (ANNs) in image classification and object detection. To address this, we propose Integer Binary Leaky Integrate-and-Fire (IB-LIF) neurons, introducing the first integer-domain binary LIF mechanism coupled with a dynamic range alignment strategy. This synergy enables virtual temporal expansion and high-magnitude spike activation, effectively overcoming SNN representational bottlenecks. Crucially, our method preserves intrinsic spiking computation and ultra-low-power advantages. On ImageNet, the resulting SNN achieves 74.19% top-1 accuracy—surpassing prior state-of-the-art by 3.45%. On COCO, it attains 66.2% mAP@50 and 49.1% mAP@50:95, outperforming previous best results by 1.6% and 1.8%, respectively. Moreover, it delivers a 6.3× improvement in energy efficiency.

📝 Abstract
Spiking Neural Networks (SNNs) are noted for their brain-like computation and energy efficiency, but their performance lags behind Artificial Neural Networks (ANNs) in tasks like image classification and object detection due to limited representational capacity. To address this, we propose a novel spiking neuron, Integer Binary-Range Alignment Leaky Integrate-and-Fire, to exponentially expand the information expression capacity of spiking neurons with only a slight energy increase. This is achieved through Integer Binary Leaky Integrate-and-Fire and a range alignment strategy. The Integer Binary Leaky Integrate-and-Fire allows integer-valued activation during training and maintains spike-driven dynamics during inference through a binary conversion that expands virtual timesteps. The range alignment strategy solves the spike activation limitation problem in which neurons fail to activate high integer values. Experiments show our method outperforms previous SNNs, achieving 74.19% accuracy on ImageNet and 66.2% mAP@50 and 49.1% mAP@50:95 on COCO, surpassing previous bests with the same architecture by +3.45%, +1.6%, and +1.8%, respectively. Notably, our SNNs match or exceed ANNs' performance with the same architecture, and energy efficiency is improved by 6.3×.
Problem

Research questions and friction points this paper is trying to address.

Enhance SNN performance in image tasks
Expand spiking neuron information capacity
Improve energy efficiency in neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integer Binary Leaky Integrate-and-Fire neuron
Range alignment strategy for activation
Expands virtual timesteps with binary conversion
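The two innovations above can be illustrated with a minimal sketch. This is an assumed formulation, not the paper's exact equations: the function and parameter names (`ib_lif_step`, `d_max`, `beta`, `v_th`) are hypothetical, and the soft-reset and clipping details are illustrative. The key idea shown is that the neuron emits an integer activation during training, which at inference unrolls into a binary spike train over virtual timesteps so computation stays spike-driven.

```python
import numpy as np

def ib_lif_step(x, v, v_th=1.0, d_max=3, beta=0.5):
    """One step of a hypothetical Integer Binary LIF neuron (a sketch).

    Training-time forward: the membrane potential integrates the input with
    leak factor beta, then the neuron emits an *integer* activation in
    {0, ..., d_max} (how many thresholds the potential crosses), rather
    than a single 0/1 spike.
    """
    v = beta * v + x                                        # leaky integration
    s = int(np.clip(np.floor(v / v_th), 0, d_max))          # integer spike count
    v = v - s * v_th                                        # soft reset by emitted spikes
    return s, v

def to_binary_spikes(s, d_max=3):
    """Inference-time conversion: unroll the integer activation s into a
    binary spike train over d_max virtual timesteps; the train sums back
    to s, so spike-driven (0/1) dynamics are preserved."""
    return np.array([1 if t < s else 0 for t in range(d_max)])

# Example: an input of 2.7 crosses the threshold twice in one step,
# so the integer activation 2 becomes two binary spikes at inference.
s, v = ib_lif_step(2.7, 0.0)
train = to_binary_spikes(s)
```

Under this sketch, the range alignment strategy would act on the distribution of `s` so that high integer values (near `d_max`) are actually reachable during training; that part is omitted here since the page does not specify its form.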
Binghao Ye
Shenzhen Institutes of Advanced Technology, CAS; University of Chinese Academy of Sciences
Wenjuan Li
State Key Laboratory of Multimodal Artificial Intelligence Systems, CASIA; PeopleAI Inc. Beijing, China
Dong Wang
Shenzhen Institutes of Advanced Technology, CAS
Man Yao
Institute of Automation, Chinese Academy of Sciences
Bing Li
State Key Laboratory of Multimodal Artificial Intelligence Systems, CASIA; PeopleAI Inc. Beijing, China
Weiming Hu
State Key Laboratory of Multimodal Artificial Intelligence Systems, CASIA; School of Information Science and Technology, ShanghaiTech University
Dong Liang
Shenzhen Institutes of Advanced Technology, CAS
Kun Shang
Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences
Optimization · SNN · AI