A Variational Latent Equilibrium for Learning in Cortex

📅 2026-03-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of learning complex spatiotemporal patterns under biologically plausible constraints, offering an alternative to backpropagation through time (BPTT), which is inconsistent with known brain mechanisms. The authors propose a variational latent equilibrium framework grounded in principles of energy conservation and extremal action. Starting from a prospective energy function of neuronal states, they derive real-time error dynamics for continuous-time neural networks; in the general case this recovers the adjoint method, the time-continuous equivalent of BPTT, and with a few modifications it yields fully local (in space and time) synaptic and neuronal update rules. The approach unifies and extends existing local, continuous-time, phase-free credit assignment mechanisms, approximating BPTT in a controlled manner while respecting biological plausibility. The framework thus provides a rigorous theoretical foundation and a concrete design blueprint for brain-inspired computing and neuromorphic hardware.
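To make the flavor of such fully local update rules concrete, here is a minimal illustrative sketch of a two-layer continuous-time network whose neuron and synapse dynamics both descend a simple quadratic mismatch energy. It is not the paper's model: the prospective (look-ahead) coordinates central to the framework are omitted, and the layer sizes, time constants, activation function, and nudging term are assumptions chosen only to show that every update depends solely on locally available quantities.

```python
import numpy as np

# Minimal illustrative sketch (NOT the paper's model): a two-layer
# continuous-time network whose neuron and synapse dynamics both descend a
# simple quadratic mismatch energy, so that every update uses only locally
# available quantities. The prospective (look-ahead) coordinates central to
# the paper are omitted; sizes, constants and the activation are assumptions.

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 5, 2
W1 = rng.normal(scale=0.1, size=(n_hid, n_in))   # input  -> hidden weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden -> output weights

tau, dt, eta, beta = 10.0, 0.1, 1e-3, 0.1  # time constant, Euler step,
                                           # learning rate, output nudging
u1 = np.zeros(n_hid)  # hidden membrane potentials
u2 = np.zeros(n_out)  # output membrane potentials

phi = np.tanh  # assumed activation


def phi_prime(u):
    return 1.0 - np.tanh(u) ** 2


def step(x, y_target):
    """One Euler step of neuronal relaxation plus local plasticity."""
    global u1, u2, W1, W2
    r1 = phi(u1)
    e1 = u1 - W1 @ x    # local mismatch errors (gradients of the energy)
    e2 = u2 - W2 @ r1
    # neuron dynamics: leaky error correction plus error feedback from above
    du1 = (-e1 + phi_prime(u1) * (W2.T @ e2)) / tau
    du2 = (-e2 + beta * (y_target - u2)) / tau
    u1 += dt * du1
    u2 += dt * du2
    # synapse dynamics: postsynaptic error times presynaptic activity,
    # i.e. descent on the same energy, local in space and time
    W1 += eta * np.outer(e1, x)
    W2 += eta * np.outer(e2, r1)


# drive the network with a fixed input/target pair for a few steps
x = rng.normal(size=n_in)
y_target = np.array([0.5, -0.5])
for _ in range(200):
    step(x, y_target)
print("output error:", np.linalg.norm(y_target - u2))
```

In this toy picture the weight updates are Hebbian-like products of presynaptic activity and a postsynaptic error, which is the kind of spatial and temporal locality the framework derives rigorously from its prospective energy function.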

📝 Abstract
Brains remain unrivaled in their ability to recognize and generate complex spatiotemporal patterns. While AI is able to reproduce some of these capabilities, deep learning algorithms remain largely at odds with our current understanding of brain circuitry and dynamics. This is prominently the case for backpropagation through time (BPTT), the go-to algorithm for learning complex temporal dependencies. In this work we propose a general formalism to approximate BPTT in a controlled, biologically plausible manner. Our approach builds on, unifies and extends several previous approaches to local, time-continuous, phase-free spatiotemporal credit assignment based on principles of energy conservation and extremal action. Our starting point is a prospective energy function of neuronal states, from which we calculate real-time error dynamics for time-continuous neuronal networks. In the general case, this provides a simple and straightforward derivation of the adjoint method result for neuronal networks, the time-continuous equivalent to BPTT. With a few modifications, we can turn this into a fully local (in space and time) set of equations for neuron and synapse dynamics. Our theory provides a rigorous framework for spatiotemporal deep learning in the brain, while simultaneously suggesting a blueprint for physical circuits capable of carrying out these computations. These results reframe and extend the recently proposed Generalized Latent Equilibrium (GLE) model.
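As a rough sketch of what a prospective energy function of this kind can look like (following the earlier Latent Equilibrium formulation that this work builds on; the layer index, the single time constant τ, and the quadratic form are assumptions, not the paper's exact definitions):

```latex
% Prospective (look-ahead) state and a quadratic mismatch energy (assumed form)
\[
\breve{u}_\ell = u_\ell + \tau\,\dot{u}_\ell, \qquad
E(t) = \sum_{\ell} \tfrac{1}{2}\,\lVert e_\ell \rVert^2, \qquad
e_\ell = \breve{u}_\ell - W_\ell\, \varphi(\breve{u}_{\ell-1}),
\]
% Neuronal and synaptic dynamics then follow as gradient-style descent on E
\[
\tau\,\dot{u}_\ell \propto -\frac{\partial E}{\partial u_\ell}, \qquad
\dot{W}_\ell \propto -\frac{\partial E}{\partial W_\ell}
  = e_\ell\, \varphi(\breve{u}_{\ell-1})^{\!\top}.
\]
```

In such a scheme plasticity reduces to a presynaptic rate times a postsynaptic error, i.e. it is local in space, while the prospective coordinate makes the errors available in real time rather than only in retrospect.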
Problem

Research questions and friction points this paper is trying to address.

biologically plausible learning
spatiotemporal credit assignment
backpropagation through time
neuronal dynamics
energy-based models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variational Latent Equilibrium
biologically plausible learning
continuous-time credit assignment
energy-based neural dynamics
local learning rules