Backpropagation through space, time, and the brain

📅 2024-03-25
🏛️ arXiv.org
📈 Citations: 8
Influential: 0
🤖 AI Summary
Achieving efficient credit assignment under spatiotemporal locality constraints in physical neural networks remains a fundamental challenge in neuromorphic computing. This paper introduces the Generalized Latent Equilibrium (GLE) framework, which derives biologically plausible forward and backward continuous-time dynamics from an energy function built on neuron-local mismatches. By modeling dendritic morphology and exploiting neurons' ability to phase-shift their output rate relative to their membrane potential, GLE implements a spatiotemporal convolution in the forward pass and a temporal inversion of feedback signals in the backward pass. Crucially, GLE relies exclusively on local synaptic plasticity, requiring neither global timing coordination nor external error broadcasting. Experiments demonstrate that, under strict locality constraints, GLE approximates the performance of backpropagation through time (BPTT), enables real-time online learning with minimal memory overhead, and provides an interpretable, biologically realistic credit-assignment mechanism for deep cortical networks.

📝 Abstract
How physical networks of neurons, bound by spatio-temporal locality constraints, can perform efficient credit assignment remains, to a large extent, an open question. In machine learning, the answer is almost universally given by the error backpropagation algorithm, through both space and time. However, this algorithm is well known to rely on biologically implausible assumptions, in particular with respect to spatio-temporal (non-)locality. Alternative forward-propagation models such as real-time recurrent learning only partially solve the locality problem, and do so at the cost of scalability, due to prohibitive storage requirements. We introduce Generalized Latent Equilibrium (GLE), a computational framework for fully local spatio-temporal credit assignment in physical, dynamical networks of neurons. We start by defining an energy based on neuron-local mismatches, from which we derive both neuronal dynamics, via stationarity, and parameter dynamics, via gradient descent. The resulting dynamics can be interpreted as a real-time, biologically plausible approximation of backpropagation through space and time in deep cortical networks with continuous-time neuronal dynamics and continuously active, local synaptic plasticity. In particular, GLE exploits the morphology of dendritic trees to enable more complex information storage and processing in single neurons, as well as the ability of biological neurons to phase-shift their output rate with respect to their membrane potential, which is essential in both directions of information propagation. For the forward computation, it enables the mapping of time-continuous inputs to neuronal space, effectively performing a spatio-temporal convolution. For the backward computation, it permits the temporal inversion of feedback signals, which consequently approximate the adjoint variables necessary for useful parameter updates.
Problem

Research questions and friction points this paper is trying to address.

How physical neuron networks achieve efficient credit assignment under spatio-temporal constraints
Addressing biological implausibility of backpropagation in neural networks
Developing a local spatio-temporal credit assignment framework for dynamical neuron networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized Latent Equilibrium for local credit assignment
Energy-based neuron-local mismatches drive dynamics
Dendritic morphology enables complex neuron processing
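The ability of neurons to phase-shift their output rate relative to their membrane potential, which the summary and abstract identify as essential for both forward convolution and feedback inversion, can be illustrated with a toy simulation. The prospective form u + τ·du/dt, which leads a sinusoidal membrane potential by arctan(τω), is used here purely for illustration; all names and parameter values are assumptions, not the paper's code.

```python
import numpy as np

tau = 0.01              # prospective time constant (s); arbitrary choice
omega = 2 * np.pi * 5   # 5 Hz test oscillation
t = np.linspace(0.0, 0.2, 2001)  # one full period, single peak at t = 0.05 s

u = np.sin(omega * t)            # membrane potential
du = np.gradient(u, t)           # numerical time derivative
prospective = u + tau * du       # phase-advanced output variable

# For a sinusoid, u + tau*du/dt = sqrt(1 + (tau*omega)^2) * sin(omega*t + advance),
# so the output peak *leads* the membrane-potential peak by advance/omega seconds.
advance = np.arctan(tau * omega)
print(f"predicted phase advance: {np.degrees(advance):.1f} degrees")
```

Flipping the sign of τ retards rather than advances the signal, which is the intuition behind using such phase shifts in both directions: advancing in the forward pass and inverting feedback in time in the backward pass.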