Spatial Spiking Neural Networks Enable Efficient and Robust Temporal Computation

📅 2025-12-10
📈 Citations: 0 · Influential: 0
🤖 AI Summary
To address the parameter redundancy, biological implausibility, and poor hardware compatibility of trainable synaptic delays in spiking neural networks (SNNs), this paper proposes Spatial SNNs (SpSNNs): neurons are embedded in a 2D or 3D Euclidean space, and synaptic delays are derived intrinsically from inter-neuron geometric distances, replacing conventional per-synapse trainable delays. This is the first work to employ spatial embedding as a biologically plausible, parameter-efficient delay-generation mechanism. The embedding acts as a geometric regularizer, enabling dynamic sparsification with up to 90% connection pruning without accuracy loss. SpSNNs support arbitrary spiking neuron models, exact delay-gradient computation via automatic differentiation, and direct mapping onto neuromorphic hardware. Experiments on the Yin-Yang and Spiking Heidelberg Digits datasets show that SpSNNs use up to 18x fewer parameters than unconstrained-delay SNNs, with performance peaking at 2D and 3D embeddings.
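
A minimal sketch of the core mechanism, assuming a PyTorch-style implementation (the module name DistanceDelays, the random initialization, and the velocity scale are illustrative assumptions, not the authors' code). Trainable coordinates replace the per-synapse delay matrix, and delays fall out of pairwise Euclidean distances:

    import torch

    class DistanceDelays(torch.nn.Module):
        """Synaptic delays derived from learnable neuron coordinates.

        (n_pre + n_post) * dim position parameters stand in for the
        n_pre * n_post per-synapse delays of a conventional
        delay-trained SNN.
        """
        def __init__(self, n_pre: int, n_post: int, dim: int = 2,
                     velocity: float = 1.0):
            super().__init__()
            self.pre_pos = torch.nn.Parameter(torch.randn(n_pre, dim))
            self.post_pos = torch.nn.Parameter(torch.randn(n_post, dim))
            self.velocity = velocity  # assumed distance-to-delay scale

        def forward(self) -> torch.Tensor:
            # Pairwise distances -> (n_pre, n_post) delay matrix.
            # Autodiff propagates gradients through torch.cdist to the
            # coordinates, so positions are learned instead of delays.
            return torch.cdist(self.pre_pos, self.post_pos) / self.velocity

With dim=2, position parameters scale linearly in neuron count while per-synapse delays scale quadratically, which is the structural source of the parameter savings.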

📝 Abstract
The efficiency of modern machine intelligence depends on high accuracy with minimal computational cost. In spiking neural networks (SNNs), synaptic delays are crucial for encoding temporal structure, yet existing models treat them as fully trainable, unconstrained parameters, leading to large memory footprints, higher computational demand, and a departure from biological plausibility. In the brain, however, delays arise from physical distances between neurons embedded in space. Building on this principle, we introduce Spatial Spiking Neural Networks (SpSNNs), a framework in which neurons learn coordinates in a finite-dimensional Euclidean space and delays emerge from inter-neuron distances. This replaces per-synapse delay learning with position learning, substantially reducing parameter count while retaining temporal expressiveness. Across the Yin-Yang and Spiking Heidelberg Digits benchmarks, SpSNNs outperform SNNs with unconstrained delays despite using far fewer parameters. Performance consistently peaks in 2D and 3D networks rather than infinite-dimensional delay spaces, revealing a geometric regularization effect. Moreover, dynamically sparsified SpSNNs maintain full accuracy even at 90% sparsity, matching standard delay-trained SNNs while using up to 18x fewer parameters. Because learned spatial layouts map naturally onto hardware geometries, SpSNNs lend themselves to efficient neuromorphic implementation. Methodologically, SpSNNs compute exact delay gradients via automatic differentiation with custom-derived rules, supporting arbitrary neuron models and architectures. Altogether, SpSNNs provide a principled platform for exploring spatial structure in temporal computation and offer a hardware-friendly substrate for scalable, energy-efficient neuromorphic intelligence.
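
The 90%-sparsity result suggests pruning guided by geometry. A hedged sketch, assuming distance (equivalently, delay) is the pruning criterion; the paper's exact dynamic-sparsification rule is not spelled out here and may differ:

    import torch

    def distance_sparsity_mask(delays: torch.Tensor,
                               sparsity: float = 0.9) -> torch.Tensor:
        """Boolean mask keeping the shortest (1 - sparsity) fraction of synapses."""
        k = max(1, int(round((1.0 - sparsity) * delays.numel())))
        threshold = delays.flatten().kthvalue(k).values  # k-th smallest delay
        return delays <= threshold

Multiplying the weight matrix by this mask removes the long-range synapses; under geometric regularization the surviving short-range connections carry the computation.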
Problem

Research questions and friction points this paper is trying to address.

Per-synapse trainable delays inflate memory footprint and computational cost in SNNs
Unconstrained delay parameters depart from biology, where delays arise from physical distances between neurons
Delay-trained SNNs map poorly onto neuromorphic hardware
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neurons learn coordinates in 2D/3D Euclidean space; synaptic delays emerge from inter-neuron distances
Per-synapse delay learning is replaced by position learning, cutting parameters by up to 18x
Learned spatial layouts map naturally onto hardware geometries, enabling efficient neuromorphic implementation (see the sketch after this list)
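
To make the mechanism concrete, a hedged sketch of how distance-derived delays enter a spiking forward pass. The function name apply_delays and the dense (T, n_pre, n_post) layout are illustrative assumptions; delays are rounded to integer simulation steps here, whereas the paper computes exact, continuous delay gradients:

    import torch

    def apply_delays(spikes: torch.Tensor, delay_steps: torch.Tensor) -> torch.Tensor:
        """Shift each presynaptic spike train by its per-synapse delay.

        spikes:      (T, n_pre) spike trains over T simulation steps
        delay_steps: (n_pre, n_post) nonnegative integer delays in steps
        returns:     (T, n_pre, n_post) delayed inputs to the postsynaptic layer
        """
        T, n_pre = spikes.shape
        n_post = delay_steps.shape[1]
        d = delay_steps.long().view(1, n_pre, n_post)
        t = torch.arange(T).view(T, 1, 1)
        src = (t - d).clamp(min=0)                   # time index to read from
        pre = torch.arange(n_pre).view(1, n_pre, 1)  # presynaptic index
        return spikes[src, pre] * (t >= d)           # zero reads before t = delay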
Lennart P. L. Landsmeer
Department of Quantum and Computer Engineering, Delft University of Technology, Mekelweg 4, 2628 CD Delft, NL; NeuroComputingLab, Department of Neuroscience, Erasmus Medical Center, Dr. Molewaterplein 40, 3015 GD Rotterdam, NL
Amirreza Movahedin
Department of Quantum and Computer Engineering, Delft University of Technology, Mekelweg 4, 2628 CD Delft, NL; NeuroComputingLab, Department of Neuroscience, Erasmus Medical Center, Dr. Molewaterplein 40, 3015 GD Rotterdam, NL
Mario Negrello
NeuroComputingLab, Department of Neuroscience, Erasmus Medical Center, Dr. Molewaterplein 40, 3015 GD Rotterdam, NL; Department of Quantum and Computer Engineering, Delft University of Technology, Mekelweg 4, 2628 CD Delft, NL
Said Hamdioui
Delft University of Technology
Research interests: Computation-in-memory · Brain-inspired computing · Memory Test · Hardware dependability
Christos Strydis
NeuroComputingLab, Department of Neuroscience, Erasmus Medical Center, Dr. Molewaterplein 40, 3015 GD Rotterdam, NL; Department of Quantum and Computer Engineering, Delft University of Technology, Mekelweg 4, 2628 CD Delft, NL