🤖 AI Summary
This work investigates the precise temporal memory and autonomous replay of arbitrary stochastic spike sequences by continuous-time recurrent neural networks (CTRNNs). Addressing the challenges of relative-timing stability and associative recall under noise, the authors propose an offline weight-design method in which template-constrained optimization explicitly encourages temporally stable dynamics. Numerical experiments show that, within a well-defined parameter regime, CTRNNs can stably reproduce the precise relative timing of any given spike sequence with probability close to one, while also supporting robust, noise-tolerant associative recall. Across diverse noise conditions, the experiments demonstrate high-fidelity temporal memory and replay. The approach offers an empirically grounded framework for spike-timing-based neural coding and for studying temporal sequence processing in spiking neural systems.
📝 Abstract
The paper explores the capability of continuous-time recurrent neural networks to store and recall precisely timed scores of spike trains. We show, by numerical experiments, that this is indeed possible: within some range of parameters, any random score of spike trains (covering all neurons in the network) can be robustly memorized and autonomously reproduced with stable, accurate relative timing of all spikes, with probability close to one. We also demonstrate associative recall under noisy conditions. In these experiments, the required synaptic weights are computed offline to satisfy a template that encourages temporal stability.
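To make the offline weight-design idea concrete, here is a minimal numerical sketch under simplifying assumptions that are not from the paper: a rate-based caricature of a CTRNN (`tanh` nonlinearity), smooth sinusoidal target trajectories standing in for a filtered spike-score template, and a plain least-squares (pseudoinverse) fit of the recurrent weights so that the network's recurrent drive reproduces the template trajectory. The paper's actual templates, neuron model, and optimization constraints differ; this only illustrates the general pattern of computing weights offline from a target trajectory and then letting the network replay it autonomously.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, dt, tau = 20, 400, 0.01, 0.1  # neurons, time steps, step size, time constant

# Hypothetical target trajectories: smooth random-phase sinusoids standing in
# for a filtered spike-score template (an assumption, not the paper's setup).
t = np.arange(T) * dt
x_target = np.stack([
    np.sin(2 * np.pi * (0.5 + 0.1 * i) * t + rng.uniform(0, 2 * np.pi))
    for i in range(N)
])                                   # shape (N, T)

phi = np.tanh

# For dynamics tau * dx/dt = -x + W @ phi(x), the recurrent drive along the
# template must equal tau * dx/dt + x at every time point.
dxdt = np.gradient(x_target, dt, axis=1)
target_drive = tau * dxdt + x_target

# Offline least-squares weight design: W @ phi(x_target) ≈ target_drive.
R = phi(x_target)                    # (N, T) rates along the template
W = target_drive @ np.linalg.pinv(R) # (N, N) recurrent weights

# Autonomous replay: start from the template's initial state, no external input.
x = x_target[:, 0].copy()
deviation = []
for k in range(1, T):
    x = x + (dt / tau) * (-x + W @ phi(x))
    deviation.append(np.max(np.abs(x - x_target[:, k])))
print("max replay deviation:", max(deviation))
```

In this sketch the fit is overdetermined (T samples per weight row), so replay only tracks the template approximately; the paper's template-constrained optimization additionally enforces stability of the replayed timing, which a bare least-squares fit does not guarantee.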