Towards Self-Supervised Foundation Models for Critical Care Time Series

📅 2025-09-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Foundational models for intensive care time-series analysis remain scarce, hindered by limited labeled data and poor cross-institutional generalizability. Method: This paper introduces the first self-supervised foundational model framework tailored to this domain, built upon a Bi-Axial Transformer (BAT) architecture. It leverages multi-source electronic health records for self-supervised pretraining and supports cross-dataset fine-tuning. Contribution/Results: Its core innovations include a dual-axis modeling mechanism specifically designed for high-noise, asynchronous, multivariate clinical time series, coupled with a lightweight transfer strategy. In mortality prediction, the model significantly outperforms supervised baselines under small-sample settings (<5,000 instances), demonstrating superior generalization and robustness. These results validate its feasibility for clinical deployment in resource-constrained and multi-center environments.

Technology Category

Application Category

📝 Abstract
Domain-specific foundation models for healthcare have expanded rapidly in recent years, yet foundation models for critical care time series remain relatively underexplored due to the limited size and availability of datasets. In this work, we introduce an early-stage pre-trained foundation model for critical care time-series based on the Bi-Axial Transformer (BAT), trained on pooled electronic health record datasets. We demonstrate effective transfer learning by fine-tuning the model on a dataset distinct from the training sources for mortality prediction, where it outperforms supervised baselines, particularly for small datasets ($<5,000$). These contributions highlight the potential of self-supervised foundation models for critical care times series to support generalizable and robust clinical applications in resource-limited settings.
Problem

Research questions and friction points this paper is trying to address.

Developing self-supervised foundation models for critical care time series data
Addressing limited dataset size and availability in critical care monitoring
Enabling effective transfer learning for mortality prediction with small datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-supervised foundation model for critical care
Uses Bi-Axial Transformer architecture for pre-training
Fine-tuned on small datasets for mortality prediction
🔎 Similar Papers
No similar papers found.