Foundation Model for Cardiac Time Series via Masked Latent Attention

📅 2026-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a self-supervised pretraining approach for electrocardiogram (ECG) foundation models based on masked latent attention, addressing the limitation of existing methods that treat individual ECG leads as independent channels and thereby neglect intrinsic structural redundancy. By introducing a latent attention mechanism, the method explicitly models high-order cross-lead interactions and enables permutation-invariant, adaptive representation aggregation, effectively leveraging the structural information inherent in ECG signals. Built upon a masked autoencoder architecture, the model is pretrained on the MIMIC-IV-ECG database and demonstrates significantly improved performance over baselines that either model leads independently or enforce explicit alignment, achieving superior representation quality and transferability on the ICD-10 diagnosis code prediction task.
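The masking step of this kind of masked-autoencoder pretraining can be sketched as follows. This is a minimal illustration, not the paper's exact scheme: the patching strategy and mask ratio are assumptions for the example (a 0.75 ratio is common in MAE-style models).

```python
import numpy as np

def random_mask(num_patches, mask_ratio, rng):
    """Shuffle patch indices and hide a `mask_ratio` fraction,
    as in masked-autoencoder pretraining. Returns (visible, masked)
    index arrays; the encoder sees only the visible patches and the
    decoder reconstructs the masked ones."""
    num_masked = int(num_patches * mask_ratio)
    perm = rng.permutation(num_patches)
    masked = perm[:num_masked]
    visible = perm[num_masked:]
    return visible, masked

# Example: 100 ECG segment patches, 75% hidden during pretraining.
rng = np.random.default_rng(0)
visible, masked = random_mask(100, 0.75, rng)
```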
📝 Abstract
Electrocardiograms (ECGs) are among the most widely available clinical signals and play a central role in cardiovascular diagnosis. While recent foundation models (FMs) have shown promise for learning transferable ECG representations, most existing pretraining approaches treat leads as independent channels and fail to explicitly leverage their strong structural redundancy. We introduce the latent attention masked autoencoder (LAMAE), an FM that directly exploits this structure by learning cross-lead connection mechanisms during self-supervised pretraining. Our approach models higher-order interactions across leads through latent attention, enabling permutation-invariant aggregation and adaptive weighting of lead-specific representations. We provide empirical evidence on the MIMIC-IV-ECG database that leveraging cross-lead connections constitutes an effective form of structural supervision, improving representation quality and transferability. Our method shows strong performance in predicting ICD-10 codes, outperforming independent-lead masked modeling and alignment-based baselines.
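The latent attention aggregation described above can be illustrated with a minimal sketch: a small set of learned latent queries attends over per-lead embeddings, yielding adaptive per-lead weights and an output that is invariant to lead ordering. The shapes, the single-head formulation, and all parameter names here are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def latent_attention_pool(lead_tokens, latents, Wq, Wk, Wv):
    """Single-head cross-attention from learned latents to lead tokens.

    lead_tokens: (num_leads, d)   per-lead representations (a set, unordered)
    latents:     (num_latents, d) learned latent queries
    Returns:     (num_latents, d) aggregated representation
    """
    Q = latents @ Wq
    K = lead_tokens @ Wk
    V = lead_tokens @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # (num_latents, num_leads)
    A = softmax(scores, axis=-1)             # adaptive weighting over leads
    # Permuting the leads permutes the columns of A and the rows of V
    # identically, so A @ V is unchanged: permutation-invariant aggregation.
    return A @ V

# Example: 12 leads, 8-dim embeddings, 4 latent queries.
rng = np.random.default_rng(0)
leads = rng.normal(size=(12, 8))
latents = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
pooled = latent_attention_pool(leads, latents, Wq, Wk, Wv)
```

Because the latents attend over the leads as a set, shuffling the lead order leaves the pooled output unchanged, which is the permutation-invariance property the abstract highlights.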
Problem

Research questions and friction points this paper is trying to address.

electrocardiogram
foundation model
structural redundancy
cross-lead dependency
representation learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

latent attention
masked autoencoder
cross-lead modeling
foundation model
ECG representation learning