STEMS: Spatial-Temporal Mapping Tool For Spiking Neural Networks

📅 2025-02-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the escalating off-chip data movement and storage overhead of spiking neural networks (SNNs) deployed on hardware, caused by time-evolving neuron states, which erodes their energy-efficiency advantage. We present the first systematic quantification of how neuron state dynamics impact off-chip data traffic and energy consumption. To mitigate this, we propose a memory-hierarchy-aware SNN state modeling methodology, coupled with a spatial-temporal joint mapping optimization strategy and a dedicated design space exploration (DSE) framework that enables intra- and inter-layer state reuse as well as safe state omission. Evaluated on two event-camera SNN benchmarks, our approach achieves up to a 12x reduction in off-chip data movement and 5x lower energy consumption. On one benchmark, it reduces neuron state volume by 20x while accelerating inference by 1.4x, all without accuracy loss.

📝 Abstract
Spiking Neural Networks (SNNs) are promising bio-inspired third-generation neural networks. Recent research has trained deep SNN models with accuracy on par with Artificial Neural Networks (ANNs). Although the event-driven and sparse nature of SNNs shows potential for more energy-efficient computation than ANNs, SNN neurons have internal states that evolve over time. Keeping track of SNN states can significantly increase data movement and storage requirements, potentially erasing SNNs' advantages with respect to ANNs. This paper investigates the energy effects of having neuron states, and how they are influenced by the chosen mapping to realistic hardware architectures with advanced memory hierarchies. To this end, we develop STEMS, a mapping design space exploration tool for SNNs. STEMS models SNNs' stateful behavior and explores intra-layer and inter-layer mapping optimizations to minimize data movement, considering both the spatial and temporal dimensions of SNNs. Using STEMS, we show up to a 12x reduction in off-chip data movement and a 5x reduction in energy (on top of intra-layer optimizations) on two event-based vision SNN benchmarks. Finally, neuron states may not be needed for all SNN layers. By optimizing neuron states for one of our benchmarks, we show a 20x reduction in neuron states and 1.4x better performance without accuracy loss.
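To make the abstract's core point concrete, here is a minimal sketch (not the STEMS tool itself, and not taken from the paper) of a leaky integrate-and-fire (LIF) neuron layer. The membrane-potential vector `v` is the per-neuron internal state that must persist across timesteps; a naive hardware mapping reads and writes this state from off-chip memory on every timestep, which is the overhead STEMS quantifies and optimizes. All names and parameter values below are illustrative assumptions.

```python
# Hypothetical LIF layer update, illustrating the time-evolving neuron state
# (membrane potential v) that drives SNN data movement and storage costs.

def lif_step(v, inputs, leak=0.9, threshold=1.0):
    """Advance one timestep: leak, integrate input, spike, reset."""
    v_next, spikes = [], []
    for vi, xi in zip(v, inputs):
        vi = leak * vi + xi          # leaky integration of input current
        if vi >= threshold:          # fire when membrane crosses threshold
            spikes.append(1)
            vi = 0.0                 # reset membrane potential after a spike
        else:
            spikes.append(0)
        v_next.append(vi)
    return v_next, spikes

# v must survive across all timesteps of an inference; for large layers this
# state is what a mapping must keep on-chip, reuse, or (where safe) omit.
v = [0.0, 0.0, 0.0]
for x in [[0.5, 1.2, 0.0], [0.6, 0.1, 0.0]]:
    v, spikes = lif_step(v, x)
```

Because `v` carries information between timesteps, temporal mapping order matters for SNNs in a way it does not for stateless ANN layers, which is why STEMS explores the spatial and temporal dimensions jointly.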
Problem

Research questions and friction points this paper is trying to address.

Energy effects of SNN neuron states
Mapping SNNs to hardware architectures
Reducing data movement and energy in SNNs
Innovation

Methods, ideas, or system contributions that make the work stand out.

STEMS tool for SNN mapping
Minimizes data movement
Optimizes neuron state efficiency