ECHO: Toward Contextual Seq2Seq Paradigms in Large EEG Models

📅 2025-09-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing large EEG models (LEMs) exhibit strong pretrained representation capabilities but lack dedicated decoders, leading to suboptimal feature utilization. Method: We propose a decoder-centric paradigm for large-scale EEG modeling, reformulating EEG analysis as a sequence-to-sequence learning task. Our framework introduces, for the first time, a discrete support-sample injection mechanism that constructs contextual cues, enabling parameter-free in-context learning and dynamic cross-task/cross-dataset adaptation. It explicitly models hierarchical relationships among neural signals, labels, and tasks. Contribution/Results: Extensive multi-dataset experiments demonstrate that, even with basic model components, our approach significantly outperforms state-of-the-art single-task LEMs. It achieves superior generalization and zero-shot transfer in multi-task settings, establishing a new foundation for adaptive, context-aware EEG intelligence.

📝 Abstract
Electroencephalography (EEG), with its broad range of applications, necessitates models that can generalize effectively across various tasks and datasets. Large EEG Models (LEMs) address this by pretraining encoder-centric architectures on large-scale unlabeled data to extract universal representations. While effective, these models lack decoders of comparable capacity, limiting the full utilization of the learned features. To address this issue, we introduce ECHO, a novel decoder-centric LEM paradigm that reformulates EEG modeling as sequence-to-sequence learning. ECHO captures layered relationships among signals, labels, and tasks within sequence space, while incorporating discrete support samples to construct contextual cues. This design equips ECHO with in-context learning, enabling dynamic adaptation to heterogeneous tasks without parameter updates. Extensive experiments across multiple datasets demonstrate that, even with basic model components, ECHO consistently outperforms state-of-the-art single-task LEMs in multi-task settings, showing superior generalization and adaptability.
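The abstract describes injecting discrete support samples into the input sequence as contextual cues for a seq2seq decoder. The paper does not publish its exact tokenization, so the following is a minimal, hypothetical sketch of that idea: each support pair (EEG-segment embedding, label) contributes a signal token and a label token, and the query embedding is appended last so the decoder can attend to the in-context examples. All names (`build_context_sequence`, `label_vocab`) are illustrative, not from the paper.

```python
import numpy as np

def build_context_sequence(support_pairs, query_embedding, label_vocab):
    """Flatten discrete support samples into a contextual cue sequence.

    Hypothetical sketch (not the paper's exact format): each support
    pair yields two tokens — the EEG-segment embedding and a label
    embedding looked up in `label_vocab` — and the query embedding is
    appended last, giving a sequence of length 2 * n_support + 1.
    """
    tokens = []
    for emb, label in support_pairs:
        tokens.append(np.asarray(emb, dtype=float))   # signal token
        tokens.append(label_vocab[label])             # label token
    tokens.append(np.asarray(query_embedding, dtype=float))  # query token
    return np.stack(tokens)

# toy usage: 2 support pairs + 1 query, embedding dimension 4
vocab = {"left": np.ones(4), "right": -np.ones(4)}
seq = build_context_sequence(
    [(np.zeros(4), "left"), (np.full(4, 0.5), "right")],
    np.full(4, 0.25),
    vocab,
)
```

Because the support set is part of the input rather than the weights, swapping in support samples from a different task or dataset changes what the decoder conditions on without any retraining.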
Problem

Research questions and friction points this paper is trying to address.

Developing decoder-centric models for EEG sequence learning
Enabling dynamic adaptation to heterogeneous EEG tasks
Improving generalization across multi-task EEG datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decoder-centric sequence-to-sequence EEG modeling paradigm
Contextual cues built by injecting discrete support samples
In-context learning enables dynamic task adaptation
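The last bullet claims task adaptation with no parameter updates. As an illustrative stand-in for the paper's decoder (not its actual method), the sketch below shows the gradient-free flavor of such adaptation: attention weights over support embeddings pool their labels, so replacing the support set re-targets the prediction to a new task at inference time.

```python
import numpy as np

def in_context_predict(query, support_embs, support_labels, temp=1.0):
    """Parameter-free prediction from contextual support samples.

    Illustrative stand-in for an in-context decoder: softmax attention
    over support embeddings pools their labels; no weights are updated,
    so a new support set means a new task, instantly.
    """
    sims = support_embs @ query / temp          # similarity logits
    w = np.exp(sims - sims.max())
    w /= w.sum()                                # softmax attention weights
    classes = sorted(set(support_labels))
    scores = {c: w[[i for i, l in enumerate(support_labels) if l == c]].sum()
              for c in classes}
    return max(scores, key=scores.get)

# toy usage: the query is most similar to the "left" support sample
embs = np.array([[1.0, 0.0], [0.0, 1.0]])
pred = in_context_predict(np.array([1.0, 0.1]), embs, ["left", "right"])
```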
Chenyu Liu
College of Computing and Data Science, Nanyang Technological University, Singapore
Yuqiu Deng
College of Computing and Data Science, Nanyang Technological University, Singapore
Tianyu Liu
School of Mechanical Engineering, Xi'an Jiaotong University, Xi'an, China
Jinan Zhou
Nutanix, CA, USA
Xinliang Zhou
Nanyang Technological University, Singapore
Brain Computer Interfaces · Foundation Models · xAI
Ziyu Jia
Institute of Automation, Chinese Academy of Sciences, Beijing, China
Yi Ding
College of Computing and Data Science, Nanyang Technological University, Singapore