Beyond Pairwise Plasticity: Group-Level Spike Synchrony Facilitates Efficient Learning in Spiking Neural Networks

📅 2025-04-14
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing spiking neural network (SNN) plasticity rules rely solely on isolated spike-pair correlations, neglecting population-level synchronous spiking—a biologically critical learning signal. Method: We propose Synchrony-Steered Dynamic Plasticity (SSDP), the first plasticity rule that explicitly models population-level spike synchrony as the core driver of synaptic updates, enabling event-driven synchrony detection, local updates grounded in spiking statistics, and hardware-agnostic integration. Contribution/Results: SSDP is the first plasticity mechanism for spiking Transformers that simultaneously satisfies biological plausibility and engineering feasibility. It reveals a novel principle wherein synchrony induces dynamical phase transitions—from chaotic to equilibrium regimes—thereby accelerating convergence. Evaluated on SNN-ResNet and SNN-Transformer, SSDP yields more stable training, superior generalization, lower energy consumption, and significantly enhanced robustness to dynamic noise—enabling efficient neuromorphic hardware deployment.

📝 Abstract
Brain networks rely on precise spike timing and coordinated activity to support robust and energy-efficient learning. Inspired by these principles, spiking neural networks (SNNs) are widely regarded as promising candidates for low-power, event-driven computing. However, most biologically inspired learning rules employed in SNNs, including spike-timing-dependent plasticity (STDP), rely on isolated spike pairs and lack sensitivity to population-level activity. This limits their stability and generalization, particularly in noisy and fast-changing environments. Motivated by biological observations that neural synchrony plays a central role in learning and memory, we introduce a spike-synchrony-dependent plasticity (SSDP) rule that adjusts synaptic weights based on the degree of coordinated firing among neurons. SSDP supports stable and scalable learning by encouraging neurons to form coherent activity patterns. One prominent outcome is a sudden transition from unstable to stable dynamics during training, suggesting that synchrony may drive convergence toward equilibrium firing regimes. We demonstrate SSDP's effectiveness across multiple network types, from minimal-layer models to spiking ResNets and SNN-Transformer. To our knowledge, this is the first application of a synaptic plasticity mechanism in a spiking transformer. SSDP operates in a fully event-driven manner and incurs minimal computational cost, making it well-suited for neuromorphic deployment. In this approach, local synaptic modifications are associated with the collective dynamics of neural networks, yielding a learning strategy that adheres to biological principles while maintaining practical efficiency. These findings position SSDP as a general-purpose optimization strategy for SNNs and offer new insights into population-based learning mechanisms in the brain.
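The abstract's central quantity is the "degree of coordinated firing among neurons." As a rough illustration only (the paper's exact synchrony measure is not given here; this coincidence-based score over binned spike trains is a hypothetical formulation), one could compute:

```python
import numpy as np

def population_synchrony(spikes: np.ndarray) -> float:
    """Crude synchrony score for a binned spike raster.

    spikes has shape (n_neurons, n_bins); the score is the peak
    fraction of neurons firing together in any single time bin.
    """
    per_bin_fraction = spikes.mean(axis=0)  # coincident activity per bin
    return float(per_bin_fraction.max())

# Perfectly synchronous population: all 5 neurons fire in bin 2.
sync_spikes = np.zeros((5, 4), dtype=bool)
sync_spikes[:, 2] = True

# Desynchronized population: exactly one neuron fires per bin.
async_spikes = np.eye(4, dtype=bool)

print(population_synchrony(sync_spikes))   # 1.0
print(population_synchrony(async_spikes))  # 0.25
```

A score like this is cheap to evaluate in an event-driven setting, since it only touches bins in which spikes actually occur.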
Problem

Research questions and friction points this paper is trying to address.

Develops spike-synchrony-dependent plasticity for biological learning patterns
Addresses inefficiency of isolated spike pair training methods
Enhances robustness against spike-time jitter and event noise
Innovation

Methods, ideas, or system contributions that make the work stand out.

SSDP adjusts weights based on synchronous firing patterns
Local post-optimization mechanism for sparse parameters
Integrates with backpropagation while preserving computation graph
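The bullets above describe a local, synchrony-gated weight update. A minimal sketch of what such an update could look like, assuming a simple Hebbian outer-product form scaled by a population synchrony signal (the baseline rate, learning rate, and update form are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

# Illustrative SSDP-style update for one event-driven time bin.
n_pre, n_post = 8, 4
rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.1, size=(n_post, n_pre))  # synaptic weights

# Binary spike vectors for the pre- and post-synaptic layers.
pre_spikes = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=float)
post_spikes = np.array([1, 1, 0, 1], dtype=float)

# Population synchrony signal: post-synaptic coincidence relative to an
# assumed baseline rate; its sign decides potentiation vs. depression.
baseline_rate = 0.25
sync = post_spikes.mean() - baseline_rate

# Local update: only co-active (post, pre) pairs change, scaled by the
# population-level synchrony term.
lr = 0.01
dW = lr * sync * np.outer(post_spikes, pre_spikes)
W += dW
```

Because `dW` depends only on the spikes present in the current bin plus one scalar population statistic, the update stays local and event-driven, which is the property the bullets above emphasize.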
Yuchen Tian
HKBU (Code Intelligence)
Assel Kembay
Department of Electrical and Computer Engineering, University of California, Santa Cruz, CA, USA
N. D. Truong
School of Biomedical Engineering, The University of Sydney, Sydney, NSW, Australia
J. K. Eshraghian
Department of Electrical and Computer Engineering, University of California, Santa Cruz, CA, USA
Omid Kavehei
The University of Sydney. Research interests: nanoelectronics, medical electronics, affective computing, learning machines, integrated circuit design