🤖 AI Summary
To enable biologically plausible, efficient, and scalable event-driven learning in large-scale sparse spiking neural networks (SNNs), this paper introduces an event-driven eligibility propagation (e-prop) rule. The method adheres to key biological constraints (locality, continuous-time neuronal dynamics, and sparse synaptic connectivity) by reformulating classical time-driven e-prop into an asynchronous, spike-triggered update mechanism, and it further incorporates recurrent network topology and energy-aware weight updates. Experiments demonstrate stable training on networks with over one million neurons; on the neuromorphic MNIST benchmark, the rule achieves state-of-the-art (SOTA) accuracy while reducing synaptic update overhead by ~62% and computational energy consumption by ~47%. These gains substantially improve learning efficiency and hardware compatibility for large-scale SNNs.
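The core idea of moving from time-driven to event-driven e-prop is that, between spikes, an eligibility trace decays deterministically and can therefore be advanced analytically at each event instead of on every simulation tick. The following is a minimal sketch of that lazy, spike-triggered update; the class name, the trace increment of 1.0, the time constant `TAU_E`, and the learning rate are illustrative assumptions, not the paper's actual parameters or API.

```python
import math

TAU_E = 20.0  # eligibility-trace time constant in ms (illustrative value)
ETA = 1e-3    # learning rate (illustrative value)

class Synapse:
    """Toy synapse with a lazily updated eligibility trace (sketch only)."""

    def __init__(self, weight=0.5):
        self.weight = weight
        self.trace = 0.0    # eligibility trace e_ij
        self.last_t = 0.0   # time of the last trace update (ms)

    def _decay_to(self, t):
        # Advance the trace analytically: exponential decay since last event.
        self.trace *= math.exp(-(t - self.last_t) / TAU_E)
        self.last_t = t

    def on_pre_spike(self, t):
        # Spike-triggered update: decay lazily, then add the spike's contribution.
        self._decay_to(t)
        self.trace += 1.0

    def on_learning_signal(self, t, L):
        # Local weight update when a learning signal L arrives at this synapse.
        self._decay_to(t)
        self.weight -= ETA * L * self.trace
```

Because the trace is only touched at events, the per-synapse cost scales with spike and learning-signal rates rather than with the number of simulation steps, which is what makes the scheme attractive for sparse, large-scale networks.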
📝 Abstract
Despite remarkable technological advances, AI systems may still benefit from biological principles, such as recurrent connectivity and energy-efficient mechanisms. Drawing inspiration from the brain, we present a biologically plausible extension of the eligibility propagation (e-prop) learning rule for recurrent spiking networks. By translating the time-driven update scheme into an event-driven one, we integrate the learning rule into a simulation platform for large-scale spiking neural networks and demonstrate its applicability to tasks such as neuromorphic MNIST. We extend the model with prominent biological features such as continuous dynamics and weight updates, strict locality, and sparse connectivity. Our results show that biologically grounded constraints can inform the design of computationally efficient AI algorithms, offering scalability to millions of neurons without compromising learning performance. This work bridges machine learning and computational neuroscience, paving the way for sustainable, biologically inspired AI systems while advancing our understanding of brain-like learning.