🤖 AI Summary
This study addresses the lack of systematic, multimodal identification of the key neural regions and their gamma- and alpha-band electroencephalographic (EEG) oscillatory patterns during deep cognitive attention. To this end, we propose Gamma2Patterns, a novel framework that, for the first time, jointly models gamma burst dynamics with eye-tracking behaviors (fixations, saccades, and pupillary signals) using 62-channel EEG data from the SEED-IV dataset. By integrating spectral power and temporal features, our multimodal approach reveals that gamma-band power and burst duration are more discriminative than alpha-band measures. We localize the core brain regions governing deep attention, particularly the frontopolar and temporal cortices, and map their cortical oscillatory signatures, thereby providing critical neurophysiological evidence for brain-inspired attention mechanisms.
📝 Abstract
Deep cognitive attention is characterized by heightened gamma oscillations and coordinated visual behavior. Despite the physiological importance of these mechanisms, computational studies rarely integrate these modalities or identify the neural regions most responsible for sustained focus. To address this gap, this work introduces Gamma2Patterns, a multimodal framework that characterizes deep cognitive attention by leveraging complementary gamma- and alpha-band EEG activity alongside eye-tracking measurements. Using the SEED-IV dataset [1], we extract spectral power, burst-based temporal dynamics, and fixation, saccade, and pupil signals across 62 EEG channels (electrodes) to analyze how neural activation differs between high-focus (gamma-dominant) and low-focus (alpha-dominant) states. Our findings reveal that the frontopolar, temporal, anterior frontal, and parieto-occipital regions exhibit the strongest gamma power and burst rates, indicating their dominant role in deep attentional engagement, while eye-tracking signals confirm complementary contributions from the frontal, frontopolar, and frontotemporal regions. Furthermore, we show that gamma power and burst duration provide more discriminative markers of deep focus than alpha power alone, demonstrating their value for attention decoding. Collectively, these results establish a multimodal, evidence-based map of the cortical regions and oscillatory signatures underlying deep focus, providing a neurophysiological foundation for future brain-inspired attention mechanisms in AI systems.
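The two feature families named in the abstract (band-limited spectral power and gamma-burst temporal dynamics) can be sketched for a single EEG channel. The paper's exact preprocessing, sampling rate, band edges, and burst criterion are not specified here, so the values below (200 Hz, 30–50 Hz gamma, mean + 2 SD envelope threshold) are illustrative assumptions using standard NumPy/SciPy routines, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import welch, butter, filtfilt, hilbert

FS = 200  # Hz; assumed sampling rate, not taken from the paper

def band_power(signal, fs, band):
    """Mean spectral power inside a frequency band, via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[mask].mean())

def gamma_bursts(signal, fs, band=(30.0, 50.0), thresh_sd=2.0):
    """Burst count and mean burst duration (s): a burst is a contiguous run
    where the band-limited Hilbert envelope exceeds mean + thresh_sd * std."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    env = np.abs(hilbert(filtfilt(b, a, signal)))
    above = env > env.mean() + thresh_sd * env.std()
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if above[0]:
        starts = np.r_[0, starts]
    if above[-1]:
        ends = np.r_[ends, above.size]
    durs = (ends - starts) / fs
    return len(durs), (float(durs.mean()) if len(durs) else 0.0)

# Toy trace: sustained 10 Hz alpha background plus a brief 40 Hz gamma burst.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / FS)  # 4 s of data
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
x[400:500] += 3.0 * np.sin(2 * np.pi * 40 * t[400:500])  # 0.5 s burst

n_bursts, mean_dur = gamma_bursts(x, FS)
alpha_p = band_power(x, FS, (8, 13))
gamma_p = band_power(x, FS, (30, 50))
```

Per-channel features like `n_bursts`, `mean_dur`, and band powers would then be aggregated across the 62 electrodes to compare gamma-dominant and alpha-dominant states, as the abstract describes.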