🤖 AI Summary
This study investigates the transfer effectiveness of general-purpose time-series foundation models on electroencephalography (EEG) downstream tasks—including motor imagery classification and sleep stage prediction—addressing the long-standing reliance on domain-specific architectures in EEG analysis. To overcome this limitation, we propose two pretraining paradigms: (1) cross-domain pretraining using large-scale, real-world multivariate time-series data from non-neurological domains, and (2) self-supervised pretraining on purely synthetic time-series data without labels or neurophysiological priors. Experimental results demonstrate that both foundation models, after lightweight fine-tuning, consistently outperform specialized baselines—including EEGNet and CBraMod—across multiple standardized EEG benchmarks. These findings establish that general-purpose time-series foundation models possess strong cross-domain transferability and can adapt efficiently to EEG analysis without requiring neural signal domain knowledge. The work introduces a scalable, low-dependency modeling paradigm for biomedical time-series analysis.
📝 Abstract
Foundation models for time series are emerging as powerful general-purpose backbones, yet their potential for domain-specific biomedical signals such as electroencephalography (EEG) remains largely unexplored. In this work, we investigate the applicability of a recently proposed time-series classification foundation model to EEG tasks such as motor imagery classification and sleep stage prediction. We test two pretraining regimes: (a) pretraining on heterogeneous real-world time series from multiple domains, and (b) pretraining on purely synthetic data. We find that both variants yield strong performance, consistently outperforming EEGNet, a widely used convolutional baseline, and CBraMod, a recent EEG-specific foundation model. These results suggest that generalist time-series foundation models, even when pretrained on data of non-neural origin or on synthetic signals, can transfer effectively to EEG. Our findings highlight the promise of cross-domain pretrained models for brain signal analysis and suggest that EEG may benefit from advances in the broader time-series literature.
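To make the transfer setup concrete, here is a minimal sketch of the "lightweight fine-tuning" pattern the abstract describes: a pretrained backbone is kept frozen and only a small classification head is trained on EEG windows. Everything below is illustrative, not the paper's actual model; the frozen backbone is stood in for by a fixed random projection, the data are synthetic stand-ins for real EEG benchmarks, and all shapes and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes for a motor-imagery-like task (assumptions, not the paper's).
n_channels, n_samples, n_classes = 22, 256, 4
n_train, d_embed = 128, 64

# Frozen "pretrained backbone": a fixed projection from a flattened EEG
# window to an embedding. In the real setting this would be the pretrained
# time-series foundation model, loaded and left untouched.
W_backbone = rng.normal(size=(n_channels * n_samples, d_embed)) / np.sqrt(n_channels * n_samples)

def embed(x):
    """Frozen feature extractor: x has shape (batch, channels, samples)."""
    return np.tanh(x.reshape(len(x), -1) @ W_backbone)

# Synthetic stand-in data; the paper evaluates on standardized EEG benchmarks.
X = rng.normal(size=(n_train, n_channels, n_samples))
y = rng.integers(0, n_classes, size=n_train)

# Lightweight fine-tuning: train only a linear softmax head on the
# frozen embeddings (multinomial logistic regression by gradient descent).
Z = embed(X)
W_head = np.zeros((d_embed, n_classes))
for _ in range(200):
    logits = Z @ W_head
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(n_train), y] -= 1.0        # softmax cross-entropy gradient
    W_head -= 0.1 * (Z.T @ p) / n_train    # only the head is updated

train_acc = ((Z @ W_head).argmax(axis=1) == y).mean()
print(f"train accuracy on synthetic data: {train_acc:.2f}")
```

The design point this illustrates is that adapting to EEG touches very few parameters (here, only `W_head`), which is what makes cross-domain transfer cheap relative to training an EEG-specific architecture from scratch.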