Synchrony-Gated Plasticity with Dopamine Modulation for Spiking Neural Networks

📅 2025-12-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the conflict between local plasticity and supervised objectives, high memory overhead, and the difficulty of integrating biological signals into deep Spiking Neural Networks (SNNs), this paper proposes DA-SSDP, a local plasticity mechanism that integrates dopamine neuromodulation and spike synchrony. Its core innovation is synchrony gating: a batch-level gate, fixed after a brief warm-up, that correlates a synchrony metric with the task loss to assess whether the local signal is informative; when it is not, the rule degrades to a lightweight regularizer. DA-SSDP stores only binary spike indicators and Gaussian-kernel-encoded first-spike latencies, drastically reducing memory footprint, and it is fully compatible with standard surrogate-gradient backpropagation without requiring architectural modifications. Evaluated on CIFAR-10, CIFAR-100, CIFAR10-DVS, and ImageNet-1K, DA-SSDP improves top-1 accuracy by 0.42%, 0.99%, 0.1%, and 0.73%, respectively, with only a modest increase in training cost.

📝 Abstract
While surrogate backpropagation proves useful for training deep spiking neural networks (SNNs), incorporating biologically inspired local signals at scale remains challenging. This difficulty stems primarily from the high memory demands of maintaining accurate spike-timing logs and the potential for purely local plasticity adjustments to clash with the supervised learning goal. To effectively leverage local signals derived from spiking neuron dynamics, we introduce Dopamine-Modulated Spike-Synchrony-Dependent Plasticity (DA-SSDP), a loss-sensitive rule that brings a synchrony-based local learning signal to the model. DA-SSDP condenses spike patterns into a batch-level synchrony metric. A brief initial warm-up phase assesses its relationship to the task loss and sets a fixed gate that subsequently adjusts the local update's magnitude. In cases where synchrony proves unrelated to the task, the gate settles at one, simplifying DA-SSDP to a basic two-factor synchrony mechanism that delivers minor weight adjustments driven by concurrent spike firing and a Gaussian latency function. These small weight updates are added only to the network's deeper layers after the backpropagation phase, and our tests showed this simplified version did not degrade performance and sometimes gave a small accuracy boost, serving as a regularizer during training. The rule stores only binary spike indicators and first-spike latencies encoded with a Gaussian kernel. Without altering the model structure or optimization routine, evaluations on benchmarks including CIFAR-10 (+0.42%), CIFAR-100 (+0.99%), CIFAR10-DVS (+0.1%), and ImageNet-1K (+0.73%) demonstrated consistent accuracy gains, accompanied by a minor increase in computational overhead. Our code is available at https://github.com/NeuroSyd/DA-SSDP.
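The two-factor update described above can be sketched as follows. This is a minimal illustration, not the paper's exact rule: the function name, the outer-product form of the coincidence term, and the parameter values (`eta`, `sigma`) are assumptions made for the example.

```python
import numpy as np

def ssdp_update(pre_spikes, post_spikes, pre_lat, post_lat,
                gate=1.0, eta=1e-4, sigma=5.0):
    """Illustrative two-factor synchrony update (a sketch, not the paper's rule).

    pre_spikes, post_spikes : binary indicators of whether each neuron fired (0/1)
    pre_lat, post_lat       : first-spike latencies, in time steps
    gate                    : fixed scalar set during warm-up (1.0 = plain synchrony rule)
    """
    # Coincidence factor: nonzero only where both pre- and post-neuron fired.
    coincidence = np.outer(post_spikes, pre_spikes)
    # Gaussian kernel on the latency difference: near-synchronous spikes weigh most.
    lat_diff = post_lat[:, None] - pre_lat[None, :]
    kernel = np.exp(-lat_diff**2 / (2.0 * sigma**2))
    # Small additive weight update, added after the backpropagation step.
    return gate * eta * coincidence * kernel
```

With `gate = 1.0` this reduces to the plain two-factor rule the abstract describes; pairs where either neuron stayed silent receive exactly zero update, and perfectly synchronous pairs receive the maximum `gate * eta`.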
Problem

Research questions and friction points this paper is trying to address.

Integrating biologically inspired local plasticity signals into deep SNN training at scale
High memory demands of maintaining accurate spike-timing logs
Risk that purely local plasticity updates conflict with the supervised objective
Innovation

Methods, ideas, or system contributions that make the work stand out.

DA-SSDP uses synchrony metric and dopamine modulation for local learning
It gates plasticity based on synchrony-loss relationship to adjust updates
Stores only binary spikes and latencies, adding updates after backpropagation
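The warm-up gating idea can be sketched as below. Everything here is an assumption made for illustration, including the function name, the use of a Pearson correlation, the threshold, and the linear mapping from correlation to gate; the paper's actual gate formula may differ.

```python
import numpy as np

def fit_gate(sync_history, loss_history, corr_threshold=0.2, g_max=2.0):
    """Illustrative warm-up gate: correlate a batch-level synchrony metric with
    the task loss; if they appear unrelated, the gate settles at 1, degrading
    the rule to a plain synchrony regularizer (a sketch, not the paper's formula)."""
    sync = np.asarray(sync_history, dtype=float)
    loss = np.asarray(loss_history, dtype=float)
    r = np.corrcoef(sync, loss)[0, 1]  # Pearson correlation over warm-up batches
    if not np.isfinite(r) or abs(r) < corr_threshold:
        return 1.0  # synchrony uninformative for the task: fall back to gate = 1
    # Stronger negative correlation (more synchrony, lower loss) -> larger fixed gate.
    return float(np.clip(1.0 - r, 1.0 / g_max, g_max))
```

After the warm-up batches, the returned scalar stays fixed and simply scales the magnitude of the post-backpropagation local updates.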
Yuchen Tian
HKBU
Code Intelligence
Samuel Tensingh
School of Biomedical Engineering, The University of Sydney, Sydney, NSW, Australia
Jason Eshraghian
Dept. of Electrical and Computer Engineering, University of California, Santa Cruz, CA, USA
Nhan Duy Truong
School of Biomedical Engineering, The University of Sydney, Sydney, NSW, Australia
Omid Kavehei
The University of Sydney
nanoelectronics, medical electronics, affective computing, learning machines, integrated circuit design