🤖 AI Summary
Existing EEG signal modeling in brain–computer interfaces (BCIs) suffers from limited robustness in multidimensional (temporal–spectral–spatial) representation learning and multimodal fusion. Method: This paper systematically reviews and unifies the embedding paradigms and applicability boundaries of conventional attention mechanisms and Transformer-based multi-head self-attention for EEG feature modeling, covering their integration with CNNs, RNNs, and Transformer architectures, three-dimensional attention (channel-, time-, and frequency-wise), and cross-modal attention for fusing EEG with other physiological or sensory data. Contribution: It categorizes attention-based EEG methods, analyzes how attention improves feature extraction, representation learning, and robustness to noise, and outlines open challenges and future directions for attention-based BCI research.
📝 Abstract
With the rapid advancement of deep learning, attention mechanisms have become indispensable in electroencephalography (EEG) signal analysis, significantly enhancing Brain-Computer Interface (BCI) applications. This paper presents a comprehensive review of traditional and Transformer-based attention mechanisms, their embedding strategies, and their applications in EEG-based BCI, with a particular emphasis on multimodal data fusion. By capturing EEG variations across time, frequency, and spatial channels, attention mechanisms improve feature extraction, representation learning, and model robustness. These methods can be broadly categorized into traditional attention mechanisms, which typically integrate with convolutional and recurrent networks, and Transformer-based multi-head self-attention, which excels in capturing long-range dependencies. Beyond single-modality analysis, attention mechanisms also enhance multimodal EEG applications, facilitating effective fusion between EEG and other physiological or sensory data. Finally, we discuss existing challenges and emerging trends in attention-based EEG modeling, highlighting future directions for advancing BCI technology. This review aims to provide valuable insights for researchers seeking to leverage attention mechanisms for improved EEG interpretation and application.
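As a minimal illustration of the Transformer-style multi-head self-attention the review surveys, the sketch below applies scaled dot-product attention across the time steps of an EEG embedding. Random matrices stand in for learned projection weights, and all sizes and names are hypothetical, so this shows only the mechanism, not any specific model from the literature:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, n_heads, rng):
    # x: (seq_len, d_model) — e.g. EEG time steps embedded into d_model dims
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # Random projections stand in for learned Q/K/V weights (illustration only)
    Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                  for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Split each projection into heads: (n_heads, seq_len, d_head)
    split = lambda m: m.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    # Scaled dot-product attention per head over all time-step pairs,
    # capturing long-range temporal dependencies in one step
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (n_heads, seq_len, seq_len)
    weights = softmax(scores, axis=-1)
    out = weights @ v                                    # (n_heads, seq_len, d_head)
    # Concatenate heads back to (seq_len, d_model)
    return out.transpose(1, 0, 2).reshape(seq_len, d_model), weights

rng = np.random.default_rng(0)
eeg = rng.standard_normal((128, 32))  # 128 time steps, 32-dim embedding (hypothetical)
out, attn = multi_head_self_attention(eeg, n_heads=4, rng=rng)
print(out.shape, attn.shape)  # (128, 32) (4, 128, 128)
```

The same mechanism generalizes to the channel- and frequency-wise variants discussed in the review by attending over electrode channels or spectral bands instead of time steps, and to cross-modal fusion by drawing queries from one modality and keys/values from another.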