Meta-Dynamical State Space Models for Integrative Neural Data Analysis

📅 2024-10-07
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Existing neural dynamical modeling approaches rely on single-dataset training and struggle with statistical heterogeneity across recordings. To address this, we propose the Meta Dynamical State-Space Model (Meta-DSSM), a framework that integrates meta-learning into neural dynamical modeling. Meta-DSSM parameterizes task-specific dynamical families via a low-dimensional manifold and learns a shared dynamical solution space across multi-task neural activity. It unifies variational inference, deep state-space modeling, and meta-learning to enable rapid adaptation, reconstruction, and long-horizon prediction of latent dynamics from few-shot data. Evaluated on synthetic dynamical systems and arm-reaching datasets from primate motor cortex, Meta-DSSM significantly improves few-shot reconstruction accuracy and trajectory-prediction stability. It establishes a generalizable modeling paradigm for cross-subject and cross-session neural decoding, advancing robustness and transferability in neural dynamical inference.

📝 Abstract
Learning shared structure across environments facilitates rapid learning and adaptive behavior in neural systems. This has been widely demonstrated and applied in machine learning to train models that are capable of generalizing to novel settings. However, there has been limited work exploiting the shared structure in neural activity during similar tasks for learning latent dynamics from neural recordings. Existing approaches are designed to infer dynamics from a single dataset and cannot be readily adapted to account for statistical heterogeneities across recordings. In this work, we hypothesize that similar tasks admit a corresponding family of related solutions and propose a novel approach for meta-learning this solution space from task-related neural activity of trained animals. Specifically, we capture the variabilities across recordings on a low-dimensional manifold which concisely parametrizes this family of dynamics, thereby facilitating rapid learning of latent dynamics given new recordings. We demonstrate the efficacy of our approach on few-shot reconstruction and forecasting of synthetic dynamical systems, and neural recordings from the motor cortex during different arm reaching tasks.
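The core idea in the abstract, a family of related dynamics concisely parameterized by a low-dimensional manifold, can be illustrated with a minimal sketch: shared dynamics weights are reused across tasks, while each recording gets its own small embedding vector, so adapting to a new session means fitting only that embedding. All names, dimensions, and the additive-tanh parameterization below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared dynamics parameters (meta-learned across tasks in the
# paper; drawn at random here purely for illustration).
D_Z, D_E = 3, 2                                 # latent and task-embedding dims
W_z = rng.normal(scale=0.1, size=(D_Z, D_Z))    # shared recurrent weights
W_e = rng.normal(scale=0.1, size=(D_Z, D_E))    # how the embedding shifts dynamics

def step(z, e):
    """One step of the shared latent dynamics, modulated by task embedding e."""
    return z + np.tanh(W_z @ z + W_e @ e)

def rollout(z0, e, T):
    """Forecast T steps of the latent trajectory for a task with embedding e."""
    zs = [z0]
    for _ in range(T):
        zs.append(step(zs[-1], e))
    return np.stack(zs)

# Each recording/task is summarized by a point on a low-dimensional manifold;
# few-shot adaptation would optimize only e_new (D_E numbers), not W_z or W_e.
e_new = rng.normal(size=D_E)
traj = rollout(np.zeros(D_Z), e_new, T=50)
print(traj.shape)  # (51, 3): initial state plus 50 forecast steps
```

The design point this sketch captures is the split of parameters: the expensive, data-hungry part (`W_z`, `W_e`) is shared and frozen at adaptation time, while per-recording heterogeneity is absorbed by a vector small enough to estimate from few trials.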
Problem

Research questions and friction points this paper is trying to address.

Learning shared neural dynamics across similar tasks
Overcoming statistical heterogeneities in neural recordings
Meta-learning low-dimensional manifolds for rapid adaptation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Meta-learning shared neural dynamics across tasks
Low-dimensional manifold for variability capture
Few-shot reconstruction and forecasting capability
👥 Authors

Ayesha Vermani (Champalimaud Centre for the Unknown, Champalimaud Foundation, Portugal)
Josue Nassar (RyvivyR, USA)
Hyungju Jeon (Champalimaud Centre for the Unknown, Champalimaud Foundation, Portugal)
Matthew Dowling (Champalimaud Centre for the Unknown, Champalimaud Foundation, Portugal)
Il Park (Champalimaud Centre for the Unknown, Champalimaud Foundation, Portugal)