Extending Spike-Timing Dependent Plasticity to Learning Synaptic Delays

📅 2025-06-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Synaptic delay—long overlooked in neuromorphic computing and difficult to co-optimize with synaptic weights—has hindered the biological plausibility and performance of spiking neural networks (SNNs). Method: This paper proposes an extended spike-timing-dependent plasticity (STDP) rule that, for the first time, treats synaptic delay as a learnable parameter within an unsupervised online learning framework, enabling joint optimization of both synaptic weights and delays. Contribution/Results: The method uncovers a functional coupling between synaptic efficacy and transmission delay, breaking the conventional constraint of weight-only updates in SNNs. Evaluated on multiple benchmark tasks, it consistently outperforms existing co-learning approaches and standard STDP, achieving sustained improvements in classification accuracy. Crucially, it provides empirically verifiable evidence for the functional role of synaptic delay—demonstrating its significance beyond mere biological realism and establishing a foundation for delay-aware neuromorphic learning.

📝 Abstract
Synaptic delays play a crucial role in biological neuronal networks, where their modulation has been observed in mammalian learning processes. In the realm of neuromorphic computing, although spiking neural networks (SNNs) aim to emulate biology more closely than traditional artificial neural networks do, synaptic delays are rarely incorporated into their simulation. We introduce a novel learning rule for simultaneously learning synaptic connection strengths and delays, by extending spike-timing dependent plasticity (STDP), a Hebbian method commonly used for learning synaptic weights. We validate our approach by extending a widely-used SNN model for classification trained with unsupervised learning. We then demonstrate the effectiveness of our new method by comparing it against other existing methods for co-learning synaptic weights and delays, as well as against STDP without synaptic delays. Results demonstrate that our proposed method consistently achieves superior performance across a variety of test scenarios. Furthermore, our experimental results yield insight into the interplay between synaptic efficacy and delay.
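The abstract describes jointly updating a synapse's weight and its transmission delay from pre/post spike timing. The sketch below illustrates one plausible pair-based form of such a rule; the constants, the delay-update formula, and the function name are assumptions for illustration, not the paper's actual equations. The weight follows a classic exponential STDP window applied to the *delayed* arrival time of the pre-spike, and the delay is nudged so that arrival moves toward the post-spike.

```python
import numpy as np

# Illustrative sketch of STDP extended with a delay update (assumed form,
# not the paper's exact rule). All times are in milliseconds.
A_PLUS, A_MINUS = 0.01, 0.012   # assumed potentiation/depression rates
TAU = 20.0                      # assumed STDP time constant (ms)
ETA_D = 0.1                     # assumed delay learning rate
D_MIN, D_MAX = 0.0, 25.0        # assumed bounds on synaptic delay (ms)

def stdp_with_delay(w, d, t_pre, t_post):
    """One pair-based update of weight w and delay d for a single synapse."""
    # Timing is measured against the *arrival* of the pre-spike (t_pre + d),
    # so the delay directly shapes which side of the STDP window is hit.
    dt = t_post - (t_pre + d)
    if dt >= 0:
        # Pre-spike arrived before the post-spike: potentiate, and grow the
        # delay slightly so the arrival lands closer to the post-spike.
        w += A_PLUS * np.exp(-dt / TAU)
    else:
        # Pre-spike arrived after the post-spike: depress, and shrink the
        # delay so future arrivals come earlier (more causally).
        w -= A_MINUS * np.exp(dt / TAU)
    d += ETA_D * dt / TAU
    return float(np.clip(w, 0.0, 1.0)), float(np.clip(d, D_MIN, D_MAX))
```

One design point this makes concrete: because `dt` is computed from the delayed arrival time, the weight and delay updates are coupled, which is the interplay between efficacy and delay the abstract highlights.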
Problem

Research questions and friction points this paper is trying to address.

Extending STDP to learn synaptic delays in SNNs
Co-learning synaptic weights and delays effectively
Studying interplay between synaptic efficacy and delay
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends STDP to learn synaptic delays
Simultaneously learns synaptic strengths and delays
Validated via unsupervised SNN classification model
Authors
Marissa Dominijanni, State University of New York at Buffalo
Alexander G. Ororbia, Rochester Institute of Technology (computational neuroscience, cognitive science, predictive processing, active inference)
Kenneth W. Regan, State University of New York at Buffalo