Resting-state fMRI Analysis using Quantum Time-series Transformer

📅 2025-08-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional Transformers applied to resting-state fMRI face prohibitive O(N²) computational complexity, excessive parameter counts, and high sample requirements, limiting their utility in brain network modeling and biomarker discovery for neuropsychiatric disorders. To address these challenges, we propose the Quantum Time-series Transformer (QTS-Transformer), the first fMRI analysis framework integrating Quantum Linear Combination of Unitaries (LCU) and Quantum Singular Value Transformation (QSVT). This enables polylogarithmic-time computation, drastically reducing parameter count and alleviating small-sample dependency. Coupled with SHAP-based interpretability, QTS-Transformer achieves performance on par with or exceeding classical Transformers on the ABCD and UK Biobank datasets, demonstrating particular robustness under limited training samples. Moreover, it successfully identifies ADHD-specific functional connectivity biomarkers, advancing both computational efficiency and clinical interpretability in neuroimaging analytics.

📝 Abstract
Resting-state functional magnetic resonance imaging (fMRI) has emerged as a pivotal tool for revealing intrinsic brain network connectivity and identifying neural biomarkers of neuropsychiatric conditions. However, classical self-attention transformer models, despite their formidable representational power, struggle with quadratic complexity, large parameter counts, and substantial data requirements. To address these barriers, we introduce a Quantum Time-series Transformer, a novel quantum-enhanced transformer architecture leveraging Linear Combination of Unitaries and Quantum Singular Value Transformation. Unlike classical transformers, Quantum Time-series Transformer operates with polylogarithmic computational complexity, markedly reducing training overhead and enabling robust performance even with fewer parameters and limited sample sizes. Empirical evaluation on the largest-scale fMRI datasets from the Adolescent Brain Cognitive Development Study and the UK Biobank demonstrates that Quantum Time-series Transformer achieves comparable or superior predictive performance compared to state-of-the-art classical transformer models, with especially pronounced gains in small-sample scenarios. Interpretability analyses using SHapley Additive exPlanations further reveal that Quantum Time-series Transformer reliably identifies clinically meaningful neural biomarkers of attention-deficit/hyperactivity disorder (ADHD). These findings underscore the promise of quantum-enhanced transformers in advancing computational neuroscience by more efficiently modeling complex spatio-temporal dynamics and improving clinical interpretability.
Problem

Research questions and friction points this paper is trying to address.

Reduces quadratic complexity in fMRI transformers
Enables robust performance with limited data
Identifies clinically meaningful neural biomarkers
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantum-enhanced transformer with polylogarithmic complexity
Leverages Linear Combination of Unitaries
Uses Quantum Singular Value Transformation
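The LCU primitive named above can be illustrated with a minimal NumPy sketch of its standard PREPARE/SELECT block-encoding: an ancilla prepared over the coefficients selects which unitary acts, so the top-left block of the resulting circuit encodes the normalized linear combination. The choice of unitaries and coefficients here is purely illustrative and not taken from the paper.

```python
import numpy as np

# Two single-qubit unitaries to combine (illustrative choice, not from the paper)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
alpha = np.array([0.7, 0.3])  # nonnegative LCU coefficients
lam = alpha.sum()             # normalization lambda = sum_i alpha_i

# PREPARE: |0> -> sum_i sqrt(alpha_i / lam) |i>
# (any unitary whose first column holds these amplitudes works)
a = np.sqrt(alpha / lam)
prep = np.array([[a[0], -a[1]],
                 [a[1],  a[0]]], dtype=complex)

# SELECT = |0><0| (x) U_0 + |1><1| (x) U_1, ancilla as the leading qubit
select = np.zeros((4, 4), dtype=complex)
select[:2, :2] = X
select[2:, 2:] = Z

I2 = np.eye(2, dtype=complex)
W = np.kron(prep.conj().T, I2) @ select @ np.kron(prep, I2)

# Top-left 2x2 block of W encodes (alpha_0 * X + alpha_1 * Z) / lam
block = W[:2, :2]
target = (alpha[0] * X + alpha[1] * Z) / lam
assert np.allclose(block, target)
```

The same PREPARE/SELECT pattern scales to many terms with log-many ancilla qubits, which is the source of the polylogarithmic costs the summary refers to.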
Junghoon Justin Park
Interdisciplinary Program in Artificial Intelligence, Seoul National University
Jungwoo Seo
Department of Brain and Cognitive Sciences, Seoul National University
Sangyoon Bae
Interdisciplinary Program in Artificial Intelligence, Seoul National University
Samuel Yen-Chi Chen
Wells Fargo
quantum computation, quantum information, machine learning, quantum machine learning
Huan-Hsin Tseng
Brookhaven National Laboratory
Quantum Computing, Machine Learning, Mathematical Physics, General Relativity, Gauge Theories
Jiook Cha
Seoul National University
Human Neuroscience, Developmental Sciences, Machine Learning
Shinjae Yoo
Brookhaven National Laboratory
Machine Learning