CausalMamba: Scalable Conditional State Space Models for Neural Causal Inference

📅 2025-10-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses two key bottlenecks in fMRI-based neural causal inference: (1) the ill-posed inverse problem arising from hemodynamic distortion of BOLD signals, and (2) the computational intractability of methods such as Dynamic Causal Modeling (DCM). We propose a scalable two-stage framework: first, jointly deconvolving BOLD signals and estimating latent neural activity via a conditional state-space model; second, performing causal graph inference using a novel conditional Mamba architecture—enabling efficient long-range dependency modeling and capturing dynamic network reconfiguration. On synthetic data, our method achieves a 37% improvement in causal identification accuracy over DCM. Applied to real task-based fMRI, it reconstructs known neuroanatomical pathways with 88% fidelity—substantially outperforming conventional approaches, which fail in >99% of subjects. Moreover, our framework reveals, for the first time, the flexible switching of causal hubs during working memory—a fundamental mechanism of adaptive brain function.

📝 Abstract
We introduce CausalMamba, a scalable framework that addresses fundamental limitations in fMRI-based causal inference: the ill-posed nature of inferring neural causality from hemodynamically distorted BOLD signals and the computational intractability of existing methods like Dynamic Causal Modeling (DCM). Our approach decomposes this complex inverse problem into two tractable stages: BOLD deconvolution to recover latent neural activity, followed by causal graph inference using a novel Conditional Mamba architecture. On simulated data, CausalMamba achieves 37% higher accuracy than DCM. Critically, when applied to real task fMRI data, our method recovers well-established neural pathways with 88% fidelity, whereas conventional approaches fail to identify these canonical circuits in over 99% of subjects. Furthermore, our network analysis of working memory data reveals that the brain strategically shifts its primary causal hub, recruiting executive or salience networks depending on the stimulus, a sophisticated reconfiguration that remains undetected by traditional methods. This work provides neuroscientists with a practical tool for large-scale causal inference that captures both fundamental circuit motifs and flexible network dynamics underlying cognitive function.
Problem

Research questions and friction points this paper is trying to address.

Inferring neural causality from hemodynamically distorted fMRI signals
Addressing computational intractability of existing causal inference methods
Capturing brain's dynamic causal network reconfigurations during cognition
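
The first friction point can be made concrete with the standard linear BOLD forward model (a textbook formulation, not quoted from this paper): the observed signal is latent neural activity blurred by a slow hemodynamic response.

```latex
% y_i(t): observed BOLD in region i;  x_i(t): latent neural activity;
% h(\tau): hemodynamic response function (HRF);  \varepsilon_i(t): noise.
y_i(t) = (h * x_i)(t) + \varepsilon_i(t)
       = \int_0^{\infty} h(\tau)\, x_i(t-\tau)\, d\tau + \varepsilon_i(t)
```

Because h acts as a temporal low-pass filter, many distinct neural time courses x_i are consistent with the same observed y_i, which is what makes direct inversion ill-posed.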
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decomposes causal inference into two tractable stages
Uses BOLD deconvolution to recover latent neural activity
Employs Conditional Mamba architecture for causal graph inference
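
Neither the conditional state-space deconvolver nor the Conditional Mamba network is specified in this summary, so the sketch below substitutes classical stand-ins to illustrate the two-stage idea: Wiener deconvolution under a canonical double-gamma HRF for stage 1, and a lagged least-squares (VAR/Granger-style) fit for stage 2. All function names, parameters, and the toy data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from math import gamma as gamma_fn


def double_gamma_hrf(t, a1=6.0, a2=16.0, ratio=1.0 / 6.0):
    """Canonical double-gamma HRF sampled at times t (seconds)."""
    g = lambda x, a: x ** (a - 1) * np.exp(-x) / gamma_fn(a)
    h = g(t, a1) - ratio * g(t, a2)
    return h / h.sum()


def wiener_deconvolve(bold, hrf, noise_power=1e-4):
    """Stage 1 stand-in: recover latent activity by Wiener deconvolution."""
    n = bold.shape[-1]
    H = np.fft.rfft(hrf, n)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.fft.irfft(W * np.fft.rfft(bold, n), n)


def lagged_causal_graph(x, lag=1):
    """Stage 2 stand-in: fit x[:, t] ~ A @ x[:, t-lag] by least squares.
    A[i, j] scores the directed influence of region j on region i."""
    sol, *_ = np.linalg.lstsq(x[:, :-lag].T, x[:, lag:].T, rcond=None)
    return sol.T


# Toy demo: two regions with a planted directed edge 0 -> 1.
rng = np.random.default_rng(0)
T = 400
hrf = double_gamma_hrf(np.arange(0.0, 30.0, 1.0))  # 1 s sampling
neural = np.zeros((2, T))
neural[0] = rng.standard_normal(T)
neural[1, 1:] = 0.9 * neural[0, :-1] + 0.1 * rng.standard_normal(T - 1)
bold = np.stack([np.convolve(s, hrf)[:T] for s in neural])
bold += 0.001 * rng.standard_normal(bold.shape)

A = lagged_causal_graph(wiener_deconvolve(bold, hrf))
# |A[1, 0]| should exceed |A[0, 1]|, recovering the planted 0 -> 1 edge.
```

Running stage 2 directly on the undeconvolved BOLD blurs the lag structure, which is the motivation for performing deconvolution first.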
Sangyoon Bae
Interdisciplinary Program in Artificial Intelligence, Seoul National University, Seoul, 08826, South Korea
Jiook Cha
Seoul National University
Human Neuroscience · Developmental Sciences · Machine Learning