Neural Encoding and Decoding at Scale

📅 2025-04-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing neural encoding/decoding models support only a unidirectional mapping (behavior→neural or neural→behavior), failing to capture the bidirectional relationship between neural activity and behavior. Method: The authors propose a large-scale, multi-animal, multi-task framework for bidirectional neural-behavioral modeling, built on a multimodal Transformer architecture. It introduces a novel multi-task-masking pretraining strategy that alternates among neural, behavioral, within-modality, and cross-modality masking, enabling unified encoding and decoding. The model is pretrained at scale on the International Brain Laboratory (IBL) repeated-site dataset and supports cross-animal transfer via fine-tuning, without requiring explicit brain-region annotations. Contribution/Results: The pretrained embeddings spontaneously acquire the ability to predict brain regions. Under the multi-animal pretraining + novel-animal fine-tuning paradigm, the model achieves state-of-the-art performance on both encoding and decoding, showing strong potential as a general-purpose foundation model for brain-behavior research.
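No code ships with this page, but the alternating masking idea is concrete enough to sketch. The PyTorch-style loop below is our illustration only: the mode names, the `apply_mask` helper, the mask ratio `p`, and the `model(neural_in, behavior_in)` interface returning both reconstructions are all assumptions, not the authors' released implementation.

```python
import random
import torch
import torch.nn.functional as F

# The four masking schemes named in the abstract; these labels are ours.
MASK_MODES = ["neural", "behavior", "within_modality", "cross_modality"]

def apply_mask(neural, behavior, mode, p=0.3):
    """Return masked inputs plus boolean masks marking reconstruction targets.

    neural:   (batch, T_n, d) tokenized neural activity
    behavior: (batch, T_b, d) tokenized behavioral variables
    Masked positions are zeroed for simplicity; a learned [MASK] embedding
    is the more common choice in practice.
    """
    n_mask = torch.zeros(neural.shape[:2], dtype=torch.bool)
    b_mask = torch.zeros(behavior.shape[:2], dtype=torch.bool)
    if mode == "neural":              # hide all neural tokens (encoding-style target)
        n_mask[:] = True
    elif mode == "behavior":          # hide all behavioral tokens (decoding-style target)
        b_mask[:] = True
    elif mode == "within_modality":   # random tokens inside each modality separately
        n_mask = torch.rand(n_mask.shape) < p
        b_mask = torch.rand(b_mask.shape) < p
    else:                             # "cross_modality": one random mask spanning both streams
        joint = torch.rand(n_mask.shape[0], n_mask.shape[1] + b_mask.shape[1]) < p
        n_mask, b_mask = joint[:, :n_mask.shape[1]], joint[:, n_mask.shape[1]:]
    neural_in = neural.masked_fill(n_mask.unsqueeze(-1), 0.0)
    behavior_in = behavior.masked_fill(b_mask.unsqueeze(-1), 0.0)
    return neural_in, behavior_in, n_mask, b_mask

def pretrain_step(model, neural, behavior, optimizer):
    """One alternating-masking step: sample a scheme, reconstruct what was hidden."""
    mode = random.choice(MASK_MODES)
    neural_in, behavior_in, n_mask, b_mask = apply_mask(neural, behavior, mode)
    if not (n_mask.any() or b_mask.any()):   # vanishingly rare all-unmasked draw
        return mode, 0.0
    neural_hat, behavior_hat = model(neural_in, behavior_in)  # assumed interface
    loss = 0.0
    if n_mask.any():
        loss = loss + F.mse_loss(neural_hat[n_mask], neural[n_mask])
    if b_mask.any():
        loss = loss + F.mse_loss(behavior_hat[b_mask], behavior[b_mask])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return mode, float(loss)
```

Sampling one scheme per step keeps a single model competent at every conditioning pattern, which is what lets the same network serve both directions later.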

📝 Abstract
Recent work has demonstrated that large-scale, multi-animal models are powerful tools for characterizing the relationship between neural activity and behavior. Current large-scale approaches, however, focus exclusively on either predicting neural activity from behavior (encoding) or predicting behavior from neural activity (decoding), limiting their ability to capture the bidirectional relationship between neural activity and behavior. To bridge this gap, we introduce a multimodal, multi-task model that enables simultaneous Neural Encoding and Decoding at Scale (NEDS). Central to our approach is a novel multi-task-masking strategy, which alternates between neural, behavioral, within-modality, and cross-modality masking. We pretrain our method on the International Brain Laboratory (IBL) repeated site dataset, which includes recordings from 83 animals performing the same visual decision-making task. In comparison to other large-scale models, we demonstrate that NEDS achieves state-of-the-art performance for both encoding and decoding when pretrained on multi-animal data and then fine-tuned on new animals. Surprisingly, NEDS's learned embeddings exhibit emergent properties: even without explicit training, they are highly predictive of the brain regions in each recording. Altogether, our approach is a step towards a foundation model of the brain that enables seamless translation between neural activity and behavior.
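Read operationally, the abstract implies that encoding and decoding at test time reduce to choosing which stream to mask. A minimal sketch under the same assumed model interface as the pretraining example above (again our illustration, not the NEDS API):

```python
import torch

def decode(model, neural, behavior_shape):
    """Decoding (neural -> behavior): feed observed neural tokens plus a fully
    masked behavior stream, then read out the reconstructed behavior."""
    behavior_in = torch.zeros(behavior_shape)  # zeros stand in for [MASK] tokens
    _, behavior_hat = model(neural, behavior_in)
    return behavior_hat

def encode(model, behavior, neural_shape):
    """Encoding (behavior -> neural): the symmetric direction, with the
    neural stream masked and the behavior stream observed."""
    neural_in = torch.zeros(neural_shape)
    neural_hat, _ = model(neural_in, behavior)
    return neural_hat
```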
Problem

Research questions and friction points this paper is trying to address.

Bidirectional neural-behavior relationship modeling
Simultaneous neural encoding and decoding
Multi-animal neural data generalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multimodal multi-task model for neural-behavioral translation
Novel multi-task-masking strategy alternating neural, behavioral, within-modality, and cross-modality masking
Pretraining on multi-animal data, then fine-tuning on new animals, yields state-of-the-art encoding and decoding
👥 Authors
Yizi Zhang, Columbia University
Yanchen Wang, Columbia University
Mehdi Azabou, Columbia University (Machine Learning, Computational Neuroscience)
Alexandre Andre, University of Pennsylvania
Zixuan Wang, Columbia University
Hanrui Lyu, Northwestern University
The International Brain Laboratory
Eva L. Dyer, University of Pennsylvania, CIFAR (Computational Neuroscience, Machine Learning, Signal Processing, Self-Supervised Learning)
Liam Paninski, Columbia University (Neural data science)
Cole Hurwitz, Postdoctoral Research Scientist, Zuckerman Institute, Columbia University (Foundation models, Neural Data Analysis, Spike sorting)