Temporal Entailment Pretraining for Clinical Language Models over EHR Data

📅 2025-04-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current clinical language models treat electronic health records (EHRs) as static text, neglecting their inherent temporal evolution and causal structure. To address this limitation, we propose a **temporal entailment pretraining objective**, the first to formalize EHRs as ordered sentence pairs and task the model with classifying the logical relationship—entailment, contradiction, or neutrality—between sequential clinical states. This explicitly fosters temporal reasoning over patient trajectories. We construct a temporally aligned sentence-pair dataset from MIMIC-IV and jointly optimize via contrastive learning and three-way classification, fully compatible with standard Transformer architectures. Our method achieves state-of-the-art performance on three distinct temporal clinical tasks: temporal clinical question answering, early warning prediction, and disease progression modeling. It significantly improves cross-task generalization and overcomes the fundamental constraints of static-text modeling paradigms.
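The joint objective described above combines three-way entailment classification with a contrastive term. A minimal sketch of how such a loss might be computed is below; the InfoNCE form of the contrastive term, the temperature, and the weighting `lam` are assumptions for illustration, not details taken from the paper.

```python
import math

# Label space for the temporal entailment task described in the summary
LABELS = {"entailment": 0, "contradiction": 1, "neutral": 2}

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, label):
    # Negative log-probability of the gold label
    return -math.log(softmax(logits)[label])

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, temperature=0.1):
    # Contrastive term (assumed InfoNCE): pull the true later-state
    # embedding toward the earlier-state anchor, push negatives away
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / temperature for s in sims]
    return cross_entropy(logits, 0)  # positive sits at index 0

def joint_loss(cls_logits, label, anchor, positive, negatives, lam=0.5):
    # Three-way entailment classification plus weighted contrastive term
    return cross_entropy(cls_logits, label) + lam * info_nce(anchor, positive, negatives)
```

In a real setup the anchor/positive vectors would come from a Transformer encoder over the earlier and later clinical states, and `cls_logits` from a pair-classification head.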

📝 Abstract
Clinical language models have achieved strong performance on downstream tasks by pretraining on domain-specific corpora such as discharge summaries and medical notes. However, most approaches treat the electronic health record as a static document, neglecting the temporally evolving and causally entwined nature of patient trajectories. In this paper, we introduce a novel temporal entailment pretraining objective for language models in the clinical domain. Our method formulates EHR segments as temporally ordered sentence pairs and trains the model to determine whether a later state is entailed by, contradictory to, or neutral with respect to an earlier state. Through this temporally structured pretraining task, models learn to perform latent clinical reasoning over time, improving their ability to generalize across forecasting and diagnosis tasks. We pretrain on a large corpus derived from MIMIC-IV and demonstrate state-of-the-art results on temporal clinical QA, early warning prediction, and disease progression modeling.
Problem

Research questions and friction points this paper is trying to address.

Addresses neglect of temporal evolution in EHR data modeling
Introduces temporal entailment pretraining for clinical reasoning
Improves performance on forecasting and diagnosis tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Temporal entailment pretraining for clinical language models
Formulates EHR segments as temporally ordered sentence pairs
Trains models to determine entailment over patient states
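The formulation of EHR segments as temporally ordered sentence pairs might look like the following sketch. The pairing of consecutive notes per patient and the field names are assumptions for illustration; the paper does not specify its exact pairing scheme.

```python
def build_temporal_pairs(notes):
    """Order timestamped note snippets per patient and emit
    (earlier, later) pairs for entailment labeling.

    `notes` maps patient_id -> list of (timestamp_hours, text).
    Pairing adjacent notes only is an assumed simplification.
    """
    pairs = []
    for patient_id, events in notes.items():
        ordered = sorted(events, key=lambda e: e[0])  # enforce temporal order
        for (t0, earlier), (t1, later) in zip(ordered, ordered[1:]):
            pairs.append({
                "patient": patient_id,
                "premise": earlier,      # earlier clinical state
                "hypothesis": later,     # later clinical state
                "delta_hours": t1 - t0,  # elapsed time between states
            })
    return pairs
```

Each resulting pair would then be labeled as entailment, contradiction, or neutral for the pretraining objective.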