ECG-FM: An Open Electrocardiogram Foundation Model

📅 2024-08-09
🏛️ arXiv.org
📈 Citations: 19
Influential: 7
🤖 AI Summary
To address key challenges in electrocardiogram (ECG) analysis—namely, heavy reliance on large-scale labeled data, poor generalizability, and the scarcity of publicly available foundation models—this work introduces ECG-FM, an open foundation model for ECG analysis. Methodologically, the authors propose a self-supervised pretraining framework that combines continuous physiological signal masked modeling with contrastive learning, augmented by ECG-specific data augmentations. Built on a transformer architecture, ECG-FM is pretrained on 2.5 million samples drawn from a dataset of 1.66 million multi-center, real-world ECGs. Extensive experiments demonstrate strong performance across diverse downstream tasks, including ECG interpretation classification, prediction of reduced left ventricular ejection fraction (LVEF), and detection of abnormal cardiac troponin. The learned representations exhibit both high discriminability and clinical interpretability, supporting a general-purpose representation-learning paradigm for electrophysiology.
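
The continuous signal masking objective described above can be illustrated with a minimal sketch: contiguous time spans of the raw multi-lead waveform are corrupted, and the reconstruction loss is computed only over the masked positions. The mask ratio, span length, and signal dimensions here are illustrative choices, not ECG-FM's actual hyperparameters.

```python
import numpy as np

def mask_spans(signal, mask_ratio=0.3, span=50, rng=None):
    """Mask contiguous time spans of a (leads, time) ECG array.

    Returns the corrupted signal and a boolean mask marking the
    positions the model must reconstruct. Hyperparameters are
    illustrative, not taken from the paper.
    """
    rng = rng or np.random.default_rng(0)
    leads, T = signal.shape
    mask = np.zeros(T, dtype=bool)
    n_spans = int(mask_ratio * T / span)
    for start in rng.integers(0, T - span, size=n_spans):
        mask[start:start + span] = True
    corrupted = signal.copy()
    corrupted[:, mask] = 0.0  # zero out masked spans across all leads
    return corrupted, mask

def masked_mse(reconstruction, target, mask):
    """Reconstruction loss computed only over masked positions."""
    return float(np.mean((reconstruction[:, mask] - target[:, mask]) ** 2))

# Toy 12-lead ECG: 5 seconds at 500 Hz (illustrative sampling rate).
ecg = np.random.default_rng(1).standard_normal((12, 2500))
corrupted, mask = mask_spans(ecg)
loss = masked_mse(np.zeros_like(ecg), ecg, mask)  # trivial zero-predictor baseline
```

In practice the corrupted signal would be fed to the transformer encoder and the loss backpropagated; this sketch only shows the masking and loss bookkeeping.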

📝 Abstract
The electrocardiogram (ECG) is a ubiquitous diagnostic test. Conventional task-specific ECG analysis models require large numbers of expensive ECG annotations or associated labels to train. Transfer learning techniques have been shown to improve generalization and reduce reliance on labeled data. We present ECG-FM, an open foundation model for ECG analysis, and conduct a comprehensive study performed on a dataset of 1.66 million ECGs sourced from both publicly available and private institutional sources. ECG-FM adopts a transformer-based architecture and is pretrained on 2.5 million samples using ECG-specific augmentations and contrastive learning, as well as a continuous signal masking objective. Our transparent evaluation includes a diverse range of downstream tasks, where we predict ECG interpretation labels, reduced left ventricular ejection fraction, and abnormal cardiac troponin. Affirming ECG-FM's effectiveness as a foundation model, we demonstrate how its command of contextual information results in strong performance, rich pretrained embeddings, and reliable interpretability. Due to a lack of open-weight practices, we highlight how ECG analysis is lagging behind other medical machine learning subfields in terms of foundation model adoption. Our code is available at https://github.com/bowang-lab/ECG-FM/.
Problem

Research questions and friction points this paper is trying to address.

Develops open ECG foundation model for label-efficient analysis
Addresses scarcity of open-weight ECG models for cross-study comparability
Improves performance on clinical tasks such as predicting reduced LVEF and abnormal cardiac troponin
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer-based model for ECG analysis
Hybrid contrastive and generative self-supervised learning
Open-weight foundation model with cross-dataset generalizability
Kaden McKeen
University Health Network, Toronto, Canada; Peter Munk Cardiac Centre, University Health Network, Toronto, Canada; AI Hub, University Health Network, Toronto, Canada; Vector Institute for Artificial Intelligence, Toronto, Canada; Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Canada
Laura Oliva
Peter Munk Cardiac Centre, University Health Network, Toronto, Canada
Sameer Masood
Department of Medicine, University of Toronto, Toronto, Canada
Augustin Toma
Department of Medical Biophysics, University of Toronto, Toronto, Canada
Barry Rubin
University Health Network, Toronto, Canada; Peter Munk Cardiac Centre, University Health Network, Toronto, Canada
Bo Wang
University Health Network, Toronto, Canada; Peter Munk Cardiac Centre, University Health Network, Toronto, Canada; AI Hub, University Health Network, Toronto, Canada; Vector Institute for Artificial Intelligence, Toronto, Canada; Department of Laboratory Medicine and Pathobiology, University of Toronto, Toronto, Canada; Department of Medical Biophysics, University of Toronto, Toronto, Canada; Department of Computer Science, University of Toronto, Toronto, Canada