LightSNN: Lightweight Architecture Search for Sparse and Accurate Spiking Neural Networks

📅 2025-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the suboptimal accuracy and energy efficiency of spiking neural networks (SNNs) that stem from directly adopting artificial neural network (ANN) architectures, this paper proposes a lightweight neural architecture search (NAS) method tailored for edge devices. The approach features three key contributions: (1) a training-free, pruning-based NAS mechanism that drastically reduces search overhead; (2) a sparsity-aware Hamming distance fitness metric that quantifies how diverse the spatiotemporal spike patterns are across data samples; and (3) a cell-based search space with backward connections, suited to the recurrent temporal dynamics of SNNs. Evaluated on CIFAR-10/100 and DVS128-Gesture, the method achieves state-of-the-art accuracy, improving classification accuracy on DVS128-Gesture by 4.49%, and accelerates architecture search by 98x over SNASNet while running 30% faster than the best existing method on DVS128-Gesture.
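The training-free search described above can be sketched as a simple score-and-select loop: sample random candidate architectures, score each with a cheap fitness metric computed from forward passes only (no gradient training), and keep the best. The callables `sample_architecture` and `fitness` below are hypothetical placeholders, not the paper's API; this is a generic sketch of the idea under those assumptions.

```python
import random

def training_free_search(sample_architecture, fitness,
                         num_candidates=100, seed=0):
    """Training-free NAS loop (sketch): no candidate is ever trained.

    sample_architecture(rng) -> a candidate architecture (any object)
    fitness(arch) -> a score from a cheap, training-free evaluation,
    e.g. a sparsity-aware Hamming distance over spike patterns.
    """
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_candidates):
        arch = sample_architecture(rng)
        score = fitness(arch)  # single forward pass, no training
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

For example, with a toy sampler that enumerates integers and a fitness peaking at 4, the loop returns the candidate with the highest score.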

📝 Abstract
Spiking Neural Networks (SNNs) are highly regarded for their energy efficiency, inherent activation sparsity, and suitability for real-time processing on edge devices. However, most current SNN methods adopt architectures resembling traditional artificial neural networks (ANNs), leading to suboptimal performance when these are applied to SNNs. While SNNs excel in energy efficiency, they have been associated with lower accuracy than traditional ANNs when using conventional architectures. In response, in this work we present LightSNN, a rapid and efficient Neural Architecture Search (NAS) technique tailored to SNNs that autonomously finds the most suitable architecture, striking a good balance between accuracy and efficiency by enforcing sparsity. Building on the spiking NAS network (SNASNet) framework, a cell-based search space including backward connections is used to build our training-free, pruning-based NAS mechanism. Our technique assesses diverse spike activation patterns across different data samples using a sparsity-aware Hamming distance fitness evaluation. Thorough experiments are conducted on both static (CIFAR10 and CIFAR100) and neuromorphic (DVS128-Gesture) datasets. Our LightSNN model achieves state-of-the-art results on CIFAR10 and CIFAR100, improves performance on DVS128-Gesture by 4.49%, and significantly reduces search time, most notably offering a 98x speedup over SNASNet and running 30% faster than the best existing method on DVS128-Gesture.
Problem

Research questions and friction points this paper is trying to address.

Optimizes SNN architecture for better accuracy and efficiency
Reduces search time for SNN architecture design
Improves performance on static and neuromorphic datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pruning-based NAS for SNN architecture search
Sparsity-aware Hamming distance fitness evaluation
Cell-based search space with backward connections
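The Hamming-distance fitness idea above can be illustrated with a minimal sketch: score a candidate network by the mean pairwise Hamming distance between the binary spike patterns it produces for different input samples, so that networks whose spike activity better separates inputs score higher. The function name and the flat pattern layout are illustrative assumptions; the paper's exact sparsity weighting is not reproduced here.

```python
import numpy as np

def sparsity_aware_hamming_fitness(spike_patterns: np.ndarray) -> float:
    """Mean pairwise Hamming distance between binary spike patterns.

    spike_patterns: (num_samples, num_neurons) array with entries in
    {0, 1}, each row the spike activations one input elicits,
    flattened over neurons and time steps. Higher scores mean the
    network's spike activity distinguishes inputs more strongly.
    """
    n = spike_patterns.shape[0]
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            # Hamming distance: number of positions where the two
            # binary patterns disagree.
            total += np.sum(spike_patterns[i] != spike_patterns[j])
            pairs += 1
    return total / pairs if pairs else 0.0
```

For instance, two identical patterns yield a fitness of 0, while patterns that disagree everywhere yield the maximum (the number of neurons), so the metric can rank candidates without any training.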
Yesmine Abdennadher
Department of Information Engineering (DEI), University of Padova, Padova, Italy
Giovanni Perin
Department of Information Engineering (DII), University of Brescia, Brescia, Italy
Riccardo Mazzieri
Department of Information Engineering (DEI), University of Padova, Padova, Italy
Jacopo Pegoraro
Department of Information Engineering, University of Padova
Wireless Sensing, Signal Processing, mmWave, Joint Communication and Sensing
Michele Rossi
Dept. of Information Engineering - University of Padova, Italy
edge computing, green mobile networks, wireless sensing, machine learning, optimization