Leveraging Generic Time Series Foundation Models for EEG Classification

📅 2025-10-31
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates how effectively general-purpose time-series foundation models transfer to downstream electroencephalography (EEG) tasks, including motor imagery classification and sleep stage prediction, questioning the long-standing reliance on domain-specific architectures in EEG analysis. Two pretraining paradigms are evaluated: (1) cross-domain pretraining on large-scale, real-world multivariate time-series data from non-neurological domains, and (2) pretraining on purely synthetic time-series data, without labels or neurophysiological priors. Experimental results show that both pretrained variants, after lightweight fine-tuning, consistently outperform specialized baselines, including EEGNet and CBraMod, across multiple standardized EEG benchmarks. These findings indicate that general-purpose time-series foundation models possess strong cross-domain transferability and can adapt efficiently to EEG analysis without requiring neural-signal domain knowledge, pointing to a scalable, low-dependency modeling paradigm for biomedical time-series analysis.
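The "lightweight fine-tuning" recipe described above might look roughly like the following sketch: a generic pretrained time-series backbone is kept frozen and only a small classification head is trained on EEG windows. This is not the authors' code; the encoder architecture, checkpoint path, channel/class counts, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's implementation) of lightweight fine-tuning
# of a generic pretrained time-series encoder on EEG classification windows.
import torch
import torch.nn as nn

class GenericTSEncoder(nn.Module):
    """Stand-in for a pretrained general-purpose time-series foundation model backbone."""
    def __init__(self, in_channels: int, embed_dim: int = 128):
        super().__init__()
        self.conv = nn.Conv1d(in_channels, embed_dim, kernel_size=7, padding=3)
        self.pool = nn.AdaptiveAvgPool1d(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch, embed_dim)
        return self.pool(torch.relu(self.conv(x))).squeeze(-1)

# Motor-imagery-like setup: 22 EEG channels, 4 classes (assumed values).
encoder = GenericTSEncoder(in_channels=22)
# encoder.load_state_dict(torch.load("pretrained_ts_foundation.pt"))  # hypothetical checkpoint

for p in encoder.parameters():          # freeze the pretrained backbone
    p.requires_grad = False

head = nn.Linear(128, 4)                # lightweight task-specific head
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Dummy EEG batch: (batch, channels, time samples).
x = torch.randn(8, 22, 1000)
y = torch.randint(0, 4, (8,))

optimizer.zero_grad()
loss = criterion(head(encoder(x)), y)
loss.backward()
optimizer.step()
```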

📝 Abstract
Foundation models for time series are emerging as powerful general-purpose backbones, yet their potential for domain-specific biomedical signals such as electroencephalography (EEG) remains largely unexplored. In this work, we investigate the applicability of a recently proposed time series classification foundation model to different EEG tasks such as motor imagery classification and sleep stage prediction. We test two pretraining regimes: (a) pretraining on heterogeneous real-world time series from multiple domains, and (b) pretraining on purely synthetic data. We find that both variants yield strong performance, consistently outperforming EEGNet, a widely used convolutional baseline, and CBraMod, the most recent EEG-specific foundation model. These results suggest that generalist time series foundation models, even when pretrained on data of non-neural origin or on synthetic signals, can transfer effectively to EEG. Our findings highlight the promise of leveraging cross-domain pretrained models for brain signal analysis, suggesting that EEG may benefit from advances in the broader time series literature.
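Regime (b) in the abstract, pretraining on purely synthetic data, could be sketched as below. The synthetic generator (random class-dependent sinusoids plus noise) and the pretext objective (predicting which generator produced a window) are illustrative assumptions only; the paper's actual synthetic-data recipe is not reproduced here.

```python
# Illustrative sketch (assumed) of pretraining an encoder on purely synthetic series,
# after which it would be fine-tuned on real EEG as in the earlier sketch.
import torch
import torch.nn as nn

def synth_batch(batch=32, channels=22, length=1000, n_classes=4):
    """Generate labeled synthetic multivariate series from a few random processes."""
    y = torch.randint(0, n_classes, (batch,))
    t = torch.linspace(0, 1, length)
    freqs = (y.float() + 1.0).view(-1, 1, 1) * 5.0            # class-dependent frequency
    x = torch.sin(2 * torch.pi * freqs * t)                   # (batch, 1, length)
    x = x.expand(-1, channels, -1) + 0.5 * torch.randn(batch, channels, length)
    return x, y

encoder = nn.Sequential(
    nn.Conv1d(22, 128, kernel_size=7, padding=3), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
pretext_head = nn.Linear(128, 4)
opt = torch.optim.AdamW(
    list(encoder.parameters()) + list(pretext_head.parameters()), lr=1e-3
)

for step in range(100):                 # short pretraining loop on synthetic data only
    x, y = synth_batch()
    loss = nn.functional.cross_entropy(pretext_head(encoder(x)), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```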
Problem

Research questions and friction points this paper is trying to address.

Applying time series foundation models to EEG classification tasks
Evaluating pretraining with heterogeneous real-world versus synthetic data
Assessing cross-domain transferability for brain signal analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using time series foundation models for EEG classification
Testing pretraining on real-world and synthetic data
Generalist models outperform both a convolutional baseline (EEGNet) and an EEG-specific foundation model (CBraMod)