🤖 AI Summary
Traditional symmetric recurrent neural networks struggle to capture the non-equilibrium, time-varying, and chaotic dynamics driven by asymmetric connectivity in biological neural circuits. To address this limitation, this work proposes a drift-diffusion matching framework that uses asymmetric continuous-time RNNs to embed the drift and diffusion terms of arbitrary stochastic dynamical systems precisely into a low-dimensional latent manifold. This approach lifts the restriction of attractor networks to equilibrium states and, for the first time, unifies non-equilibrium statistical mechanics with associative and sequential memory in a single dynamic-embedding framework built on asymmetric RNNs. The resulting model accurately reproduces complex stochastic dynamics and supports transient switching among multiple attractors, driven either by external inputs or by intrinsic non-equilibrium flows.
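
To make the matching idea concrete, here is one plausible formalisation of what "embedding drift and diffusion in a latent manifold" could mean; the notation ($\tau$, $W$, $\phi$, $P$, $\Sigma$) is ours, not the paper's, and the exact matching conditions used in the work may differ.

```latex
% Hypothetical formalisation of drift-diffusion matching (our notation).
% RNN state x in R^N with asymmetric weights W; target latent SDE in z in R^d, d << N.
\begin{align}
  \tau\,\mathrm{d}x &= \bigl(-x + W\phi(x) + b\bigr)\,\mathrm{d}t
                       + \Sigma\,\mathrm{d}\xi_t && \text{(network)}\\
  \mathrm{d}z &= f(z)\,\mathrm{d}t + g(z)\,\mathrm{d}W_t && \text{(target SDE)}
\end{align}
% With a linear readout z = Px, matching asks that, on the embedded manifold,
\begin{align}
  \tau^{-1} P\bigl(-x + W\phi(x) + b\bigr) &\approx f(Px) && \text{(drift)}\\
  \tau^{-2}\, P\,\Sigma\Sigma^{\top} P^{\top} &\approx g(Px)\,g(Px)^{\top} && \text{(diffusion)}
\end{align}
```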
📝 Abstract
Recurrent neural networks (RNNs) provide a theoretical framework for understanding computation in biological neural circuits, yet classical results, such as Hopfield's model of associative memory, rely on symmetric connectivity that restricts network dynamics to gradient-like flows. In contrast, biological networks support rich time-dependent behaviour facilitated by their asymmetry. Here we introduce a general framework, which we term drift-diffusion matching, for training continuous-time RNNs to represent arbitrary stochastic dynamical systems within a low-dimensional latent subspace. By allowing asymmetric connectivity, we show that RNNs can faithfully embed the drift and diffusion of a given stochastic differential equation, including nonlinear and nonequilibrium dynamics such as chaotic attractors. As an application, we construct RNN realisations of stochastic systems that transiently explore various attractors through both input-driven switching and autonomous transitions driven by nonequilibrium currents, which we interpret as models of associative and sequential (episodic) memory. To elucidate how these dynamics are encoded in the network, we introduce decompositions of the RNN based on its asymmetric connectivity and its time-irreversibility. Our results extend attractor neural network theory beyond equilibrium, showing that asymmetric neural populations can implement a broad class of dynamical computations within low-dimensional manifolds, unifying ideas from associative memory, nonequilibrium statistical mechanics, and neural computation.
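
As a concrete illustration of the model class the abstract describes, the sketch below simulates a small asymmetric continuous-time RNN with additive noise via Euler–Maruyama and reads out a low-dimensional latent trajectory through a linear projection. Every name and parameter choice here is ours; in particular, the weights are random rather than trained by drift-diffusion matching, so this shows only the network class and readout, not the paper's training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy continuous-time rate network (our parameterisation, not the paper's):
#   dx = (-x + W @ tanh(x)) / tau * dt + sigma * dW_t
N, d = 200, 2        # network size and latent dimension (assumed)
tau = 0.02           # membrane time constant in seconds (assumed)
gain = 1.5           # coupling gain; gain > 1 yields rich autonomous dynamics
sigma = 0.1          # additive noise amplitude (assumed)

# A dense Gaussian weight matrix is generically asymmetric (W != W.T),
# so the dynamics are not constrained to gradient-like flows.
W = gain * rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
P = rng.normal(0.0, 1.0 / np.sqrt(N), size=(d, N))  # linear latent readout

def simulate(T=5.0, dt=1e-3, x0=None):
    """Euler-Maruyama integration; returns the latent trajectory z_t = P x_t."""
    steps = int(T / dt)
    x = np.zeros(N) if x0 is None else x0.copy()
    z = np.empty((steps, d))
    for t in range(steps):
        drift = (-x + W @ np.tanh(x)) / tau
        x += drift * dt + sigma * np.sqrt(dt) * rng.normal(size=N)
        z[t] = P @ x
    return z

z = simulate()
print("latent trajectory shape:", z.shape)  # (5000, 2)
```

In the paper's framework, $W$, $b$, and the noise structure would presumably be fit so that the drift and diffusion of $z_t$ match a prescribed target SDE, for example a multi-well system whose attractors encode memories; here the random weights merely demonstrate the asymmetric dynamics and latent readout.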