Graph Adapter of EEG Foundation Models for Parameter Efficient Fine Tuning

📅 2024-11-25
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the high computational cost of spatiotemporal modeling and the severe scarcity of labeled data in EEG-based neurological disorder diagnosis, this paper proposes EGA, a lightweight graph adapter that integrates a graph neural network (GNN) into a pre-trained temporal backbone (BENDR), freezing the backbone parameters and optimizing only a low-rank spatial relation modeling module. This is the first work to combine GNNs with parameter-efficient fine-tuning (PEFT) for explicit modeling of EEG sensor topology, enabling synergistic spatiotemporal representation learning. Evaluated on major depressive disorder (MDD) classification and TUAB abnormality detection, EGA achieves up to a 16.1% improvement in F1-score, reduces GPU memory consumption by 42%, and matches full-parameter fine-tuning performance using only 10% of the annotated data, significantly enhancing feasibility for low-resource clinical deployment.

📝 Abstract
In diagnosing neurological disorders from electroencephalography (EEG) data, foundation models such as Transformers have been employed to capture temporal dynamics. Additionally, Graph Neural Networks (GNNs) are critical for representing the spatial relationships among EEG sensors. However, fine-tuning these large-scale models for both temporal and spatial features can incur prohibitively large computational costs, especially under the limited availability of labeled EEG datasets. We propose EEG-GraphAdapter (EGA), a parameter-efficient fine-tuning (PEFT) approach designed to address these challenges. EGA is integrated into a pre-trained temporal backbone model as a GNN-based module, freezing the backbone and allowing only the adapter to be fine-tuned. This enables the effective acquisition of EEG spatial representations, significantly reducing computational overhead and data requirements. Experimental evaluations on two healthcare-related downstream tasks, Major Depressive Disorder (MDD) and Abnormality Detection (TUAB), show that EGA improves performance by up to 16.1% in F1-score compared with the backbone BENDR model, highlighting its potential for scalable and accurate EEG-based predictions.
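The adapter scheme the abstract describes (frozen temporal backbone, trainable GNN module over the EEG sensor graph) can be sketched in PyTorch as follows. This is a minimal illustration under stated assumptions, not the paper's exact EGA architecture: the learnable-adjacency message passing, the module names (`GraphAdapter`, `EGALikeModel`), and the feature shapes are all illustrative, and a plain linear layer stands in for the pre-trained BENDR encoder.

```python
import torch
import torch.nn as nn

class GraphAdapter(nn.Module):
    """Lightweight GNN adapter: one round of message passing over the
    EEG sensor graph, applied to per-channel features. Illustrative
    sketch only; the paper's actual EGA layer may differ."""
    def __init__(self, num_channels: int, dim: int):
        super().__init__()
        # Learnable adjacency over EEG sensors (spatial relations),
        # initialized to the identity so the adapter starts near a no-op.
        self.adj = nn.Parameter(torch.eye(num_channels))
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):  # x: (batch, channels, dim)
        # Aggregate features from neighboring sensors, project, and add
        # back residually so the frozen backbone's features are preserved.
        return x + torch.relu(self.proj(self.adj @ x))

class EGALikeModel(nn.Module):
    """Frozen temporal backbone + trainable graph adapter + linear head."""
    def __init__(self, backbone, num_channels: int, dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():  # freeze the temporal backbone
            p.requires_grad = False
        self.adapter = GraphAdapter(num_channels, dim)  # only this (and the head) train
        self.head = nn.Linear(num_channels * dim, num_classes)

    def forward(self, x):  # x: (batch, channels, dim)
        with torch.no_grad():
            h = self.backbone(x)  # per-channel temporal features
        h = self.adapter(h)       # inject spatial (sensor-graph) information
        return self.head(h.flatten(1))

# Usage: a dummy frozen backbone standing in for pre-trained BENDR.
backbone = nn.Linear(128, 128)  # placeholder; real BENDR is a temporal encoder
model = EGALikeModel(backbone, num_channels=19, dim=128, num_classes=2)
```

Only the adapter and classification head receive gradients, which is what keeps the fine-tuning parameter-efficient; the backbone's weights never change.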
Problem

Research questions and friction points this paper is trying to address.

Efficient fine-tuning for EEG models
Reducing computational cost in EEG analysis
Enhancing spatial-temporal feature extraction in EEG
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph Adapter integration
Parameter-efficient fine-tuning
Reduced computational overhead
Authors
Toyotaro Suzumura, The University of Tokyo
Hiroki Kanezashi, Tokyo Institute of Technology (agent-based simulation, graph analytics, data mining)
Shotaro Akahori, Digital Hospital, Inc.