Langevin Flows for Modeling Neural Latent Dynamics

📅 2025-07-15
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
This work addresses the challenge of modeling latent dynamical structure in neural populations driven jointly by intrinsic dynamics and unobserved external inputs. To this end, the authors propose a sequential variational autoencoder grounded in the underdamped Langevin equation: physical priors, namely inertia, damping, and stochastic forces, are explicitly encoded in the latent dynamics, while the potential function is parameterized as a network of locally coupled oscillators to produce biologically plausible oscillatory and flow-like activity. The architecture combines a recurrent encoder, a one-layer Transformer decoder, and Langevin (stochastic differential equation) dynamics in the latent space. On synthetic data generated by a Lorenz attractor, the model accurately recovers ground-truth firing rates; across all four Neural Latents Benchmark (NLB) datasets it achieves state-of-the-art held-out neuron log-likelihood (bits per spike) and forward-prediction accuracy, and its decoded hand-movement velocity matches or surpasses current methods.
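For concreteness, the underdamped Langevin equation in its standard form reads as below (a reference sketch; the paper's exact parameterization of mass, damping, and noise scale may differ):

```latex
dz_t = v_t\,dt, \qquad
m\,dv_t = \bigl(-\gamma\, v_t - \nabla U(z_t)\bigr)\,dt + \sqrt{2\gamma k_B T}\,dW_t
```

Here $z_t$ is the latent state, $v_t$ its velocity, and $W_t$ a Wiener process: inertia enters through the mass $m$, damping through $\gamma$, stochastic forces through the noise term, and $U$ is the learned potential.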

📝 Abstract
Neural populations exhibit latent dynamical structures that drive time-evolving spiking activities, motivating the search for models that capture both intrinsic network dynamics and external unobserved influences. In this work, we introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and stochastic forces -- to represent both autonomous and non-autonomous processes in neural systems. Crucially, the potential function is parameterized as a network of locally coupled oscillators, biasing the model toward oscillatory and flow-like behaviors observed in biological neural populations. Our model features a recurrent encoder, a one-layer Transformer decoder, and Langevin dynamics in the latent space. Empirically, our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor, closely matching ground-truth firing rates. On the Neural Latents Benchmark (NLB), the model achieves superior held-out neuron likelihoods (bits per spike) and forward prediction accuracy across four challenging datasets. It also matches or surpasses alternative methods in decoding behavioral metrics such as hand velocity. Overall, this work introduces a flexible, physics-inspired, high-performing framework for modeling complex neural population dynamics and their unobserved influences.
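As a rough illustration of how such latent dynamics could be simulated, the sketch below integrates the underdamped Langevin equation with an Euler-Maruyama step in PyTorch. The function name, parameter values, and integrator choice are assumptions for illustration, not the authors' implementation:

```python
import torch

def langevin_rollout(z0, v0, potential, n_steps, dt=0.01,
                     mass=1.0, damping=0.5, temperature=1.0):
    """Euler-Maruyama rollout of underdamped Langevin latent dynamics.

    z0, v0    -- (batch, latent_dim) initial position and velocity
    potential -- callable mapping z to a (batch,) tensor of energies U(z)
    """
    z, v = z0, v0
    noise_scale = (2.0 * damping * temperature * dt) ** 0.5
    trajectory = [z]
    for _ in range(n_steps):
        # Force is the negative gradient of the learned potential.
        # Re-leaf z so autograd can differentiate U w.r.t. it; a real
        # training loop would keep the graph to backprop through time.
        zg = z.detach().requires_grad_(True)
        force = -torch.autograd.grad(potential(zg).sum(), zg,
                                     create_graph=True)[0]
        # m dv = (-gamma v - grad U) dt + sqrt(2 gamma k_B T) dW
        v = v + (dt / mass) * (-damping * v + force) \
              + (noise_scale / mass) * torch.randn_like(v)
        z = z + dt * v
        trajectory.append(z)
    return torch.stack(trajectory, dim=1)  # (batch, n_steps+1, latent_dim)
```

In a sequential VAE of this kind, a rollout like this would produce the latent trajectory that the decoder (here, a one-layer Transformer) maps to per-timestep firing rates.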
Problem

Research questions and friction points this paper is trying to address.

Model latent neural dynamics and external influences
Incorporate physical priors for autonomous and non-autonomous processes
Improve neural activity prediction and behavioral decoding accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses the underdamped Langevin equation for latent dynamics
Parameterizes the potential function as a network of locally coupled oscillators (see the sketch below)
Combines a recurrent encoder with a one-layer Transformer decoder
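One plausible reading of the locally coupled oscillator potential is sketched below: each latent dimension sits in a quadratic well and is elastically coupled to its neighbors. The quadratic form and nearest-neighbor topology are my assumptions, not details taken from the paper:

```python
import torch
import torch.nn as nn

class CoupledOscillatorPotential(nn.Module):
    """U(z) = sum_i k_i z_i^2 / 2 + sum_i c_i (z_{i+1} - z_i)^2 / 2.

    Quadratic on-site wells plus nearest-neighbor elastic coupling,
    with learned positive stiffnesses k_i and couplings c_i.
    """

    def __init__(self, latent_dim):
        super().__init__()
        self.log_stiffness = nn.Parameter(torch.zeros(latent_dim))
        self.log_coupling = nn.Parameter(torch.zeros(latent_dim - 1))

    def forward(self, z):                 # z: (batch, latent_dim)
        k = self.log_stiffness.exp()      # positive on-site stiffness
        c = self.log_coupling.exp()       # positive coupling strength
        onsite = 0.5 * (k * z.pow(2)).sum(-1)
        neighbor = 0.5 * (c * (z[..., 1:] - z[..., :-1]).pow(2)).sum(-1)
        return onsite + neighbor          # (batch,) energies
```

Under a potential of this shape, the Langevin dynamics oscillate around the well minima, which is one way to realize the bias toward oscillatory, flow-like latent trajectories described above.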