Event-driven eligibility propagation in large sparse networks: efficiency shaped by biological realism

📅 2025-11-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address biologically plausible, efficient, and scalable event-driven learning in large-scale sparse spiking neural networks (SNNs), this paper introduces an event-driven eligibility propagation (e-prop) rule. The method rigorously adheres to key biological constraints—including locality, continuous-time neuronal dynamics, and sparse synaptic connectivity—by reformulating classical time-driven e-prop into an asynchronous, spike-triggered update mechanism. It further integrates recurrent network topology and energy-aware weight updates. Experiments demonstrate stable training on networks with over one million neurons; on the neuromorphic MNIST benchmark, the method achieves state-of-the-art (SOTA) accuracy while reducing synaptic update overhead by ~62% and computational energy consumption by ~47%. These improvements significantly enhance learning efficiency and hardware compatibility for large-scale SNNs.
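The spike-triggered update mechanism described in the summary can be sketched in a few lines. The following is a minimal illustration under assumed dynamics—an exponentially decaying eligibility trace that is lazily brought up to date only when a presynaptic spike arrives—and is not the paper's actual implementation; the class, parameter names, and values are hypothetical.

```python
import math

class EventDrivenSynapse:
    """Hypothetical sketch of an event-driven eligibility trace: instead of
    decaying the trace at every simulation time step (time-driven), the exact
    exponential decay is applied lazily, only at spike events."""

    def __init__(self, weight=0.5, tau_e=20.0, lr=1e-3):
        self.weight = weight
        self.tau_e = tau_e        # eligibility time constant (ms), assumed
        self.lr = lr              # learning rate, assumed
        self.eligibility = 0.0
        self.last_update = 0.0    # time of last spike-triggered update (ms)

    def on_pre_spike(self, t, learning_signal):
        # Apply the analytic decay accumulated since the last event,
        # add the spike contribution, then consume the local learning signal.
        dt = t - self.last_update
        self.eligibility *= math.exp(-dt / self.tau_e)
        self.eligibility += 1.0
        self.weight += self.lr * learning_signal * self.eligibility
        self.last_update = t

# Three presynaptic spikes at times t (ms) with local learning signals L:
syn = EventDrivenSynapse()
for t, L in [(5.0, 0.1), (30.0, -0.05), (31.0, 0.2)]:
    syn.on_pre_spike(t, L)
print(syn.weight)
```

Because no state is touched between spikes, the per-synapse cost scales with the spike rate rather than with the number of time steps—the source of the update-overhead savings the summary reports for sparse networks.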

📝 Abstract
Despite remarkable technological advances, AI systems may still benefit from biological principles, such as recurrent connectivity and energy-efficient mechanisms. Drawing inspiration from the brain, we present a biologically plausible extension of the eligibility propagation (e-prop) learning rule for recurrent spiking networks. By translating the time-driven update scheme into an event-driven one, we integrate the learning rule into a simulation platform for large-scale spiking neural networks and demonstrate its applicability to tasks such as neuromorphic MNIST. We extend the model with prominent biological features such as continuous dynamics and weight updates, strict locality, and sparse connectivity. Our results show that biologically grounded constraints can inform the design of computationally efficient AI algorithms, offering scalability to millions of neurons without compromising learning performance. This work bridges machine learning and computational neuroscience, paving the way for sustainable, biologically inspired AI systems while advancing our understanding of brain-like learning.
Problem

Research questions and friction points this paper is trying to address.

Develops event-driven learning for spiking neural networks
Integrates biological realism like sparse connectivity into AI
Scales algorithms to millions of neurons efficiently
Innovation

Methods, ideas, or system contributions that make the work stand out.

Event-driven e-prop learning rule for spiking networks
Integrates continuous dynamics, strict locality, sparse connectivity
Scales to millions of neurons without performance loss
Agnes Korcsak-Gorzo
Institute for Advanced Simulation 6 (IAS-6), Jülich Research Centre, Jülich, Germany
Jesús A. Espinoza Valverde
Department of Mathematics and Science, University of Wuppertal, Wuppertal, Germany
Jonas Stapmanns
Department of Physiology, University of Bern, Bern, Switzerland
H. Plesser
Department of Data Science, Faculty of Science and Technology, Norwegian University of Life Sciences, Aas, Norway
David Dahmen
Institute for Neuroscience and Medicine & Institute for Advanced Simulation, Research Centre Jülich
Computational Neuroscience · Theoretical Neuroscience · Physics
Matthias Bolten
Department of Mathematics and Science, University of Wuppertal, Wuppertal, Germany
Sacha Jennifer van Albada
Institute of Zoology, University of Cologne, Cologne, Germany
Markus Diesmann
Director, IAS-6, INM-10, Jülich Research Centre
Neuroscience · Computer Science · Simulation