Sleep-Based Homeostatic Regularization for Stabilizing Spike-Timing-Dependent Plasticity in Recurrent Spiking Neural Networks

📅 2026-01-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
In recurrent spiking neural networks (SNNs), Hebbian learning based on spike-timing-dependent plasticity (STDP) often leads to unbounded weight growth, catastrophic forgetting, and loss of representational diversity. Inspired by the synaptic homeostasis hypothesis, this work introduces a periodic offline "sleep" mechanism: during phases with suppressed external input, stochastic weight decay drives the network back toward a homeostatic baseline, while spontaneous activity consolidates memory traces. The approach requires no data-dependent hyperparameter tuning and improves learning stability in STDP-based SNNs, achieving noticeable gains on MNIST-like benchmarks with only 10–20% of the training time allocated to sleep phases. The study also shows that the mechanism provides no benefit for gradient-based SNNs (e.g., surrogate-gradient SNNs), thereby delineating its scope of applicability.
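The summary does not spell out the update rules, so the following is a minimal, hypothetical sketch of the sleep-wake loop it describes: STDP updates during wake, then stochastic decay toward a homeostatic baseline during an offline sleep phase. The decay rate `lam`, baseline `w_base`, mask probability `p`, and step counts are illustrative assumptions, and `stdp_step` stands in for whatever STDP rule the network uses.

```python
# Minimal sketch of the sleep-wake training loop described in the summary.
# Assumptions (not taken from the paper): `lam`, `w_base`, `p`, and the
# wake/sleep step budget are illustrative; `stdp_step` abstracts the STDP rule.
import numpy as np

rng = np.random.default_rng(0)

def sleep_phase(W, w_base=0.5, lam=0.05, p=0.2, steps=50):
    """Suppress external input and stochastically decay weights toward baseline."""
    for _ in range(steps):
        mask = rng.random(W.shape) < p      # random subset of synapses decays
        W -= lam * mask * (W - w_base)      # pull masked weights toward w_base
    return W

def train(W, data, stdp_step, wake_steps=400, sleep_steps=50):
    """Alternate wake (STDP on data) and sleep (offline renormalization)."""
    for epoch_data in data:
        for x in epoch_data[:wake_steps]:   # wake: local Hebbian/STDP updates
            W = stdp_step(W, x)
        W = sleep_phase(W, steps=sleep_steps)  # sleep: ~10-20% of the budget
    return W
```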

📝 Abstract
Spike-timing-dependent plasticity (STDP) provides a biologically plausible learning mechanism for spiking neural networks (SNNs); however, Hebbian weight updates in architectures with recurrent connections suffer from pathological weight dynamics: unbounded growth, catastrophic forgetting, and loss of representational diversity. We propose a neuromorphic regularization scheme inspired by the synaptic homeostasis hypothesis: periodic offline phases during which external inputs are suppressed, synaptic weights undergo stochastic decay toward a homeostatic baseline, and spontaneous activity enables memory consolidation. We demonstrate that this sleep-wake cycle prevents weight saturation while preserving learned structure. Empirically, we find that low to intermediate sleep durations (10–20% of training) improve stability on MNIST-like benchmarks in our STDP-SNN model, without any data-specific hyperparameter tuning. In contrast, the same sleep intervention yields no measurable benefit for the surrogate-gradient spiking neural network (SG-SNN). Taken together, these results suggest that periodic, sleep-based renormalization may be a fundamental mechanism for stabilizing local Hebbian learning in neuromorphic systems, while also indicating that special care is required when integrating such protocols with gradient-based optimization.
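As a worked formalization of the stochastic decay described above, one plausible instantiation (an assumption; the paper's exact rule is not reproduced here) applies a Bernoulli-masked pull toward the homeostatic baseline during each sleep step:

```latex
% One plausible sleep-phase update (assumption, not the paper's verified rule).
% Each synapse w_ij decays toward the baseline w_base with probability p per
% step, so renormalization is stochastic rather than uniform across synapses:
\Delta w_{ij} = -\lambda \, \xi_{ij} \, \left( w_{ij} - w_{\mathrm{base}} \right),
\qquad \xi_{ij} \sim \mathrm{Bernoulli}(p)
```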
Problem

Research questions and friction points this paper is trying to address.

spike-timing-dependent plasticity
recurrent spiking neural networks
weight instability
catastrophic forgetting
synaptic homeostasis
Innovation

Methods, ideas, or system contributions that make the work stand out.

sleep-based regularization
spike-timing-dependent plasticity
synaptic homeostasis
recurrent spiking neural networks
memory consolidation