The Patient is not a Moving Document: A World Model Training Paradigm for Longitudinal EHR

📅 2026-01-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a limitation of current large language models: they treat electronic health records (EHR) as static text, neglecting how patient states evolve over time and in response to interventions. To overcome this, the study brings the world-model paradigm to longitudinal EHR modeling and proposes SMB-Structure, a framework that combines supervised fine-tuning (SFT) on next-token prediction with a joint-embedding prediction architecture (JEPA). This approach predicts an entire patient trajectory in latent space from the initial representation alone, circumventing the constraints of conventional autoregressive methods. Experiments on the Memorial Sloan Kettering (MSK) cohort (23,319 oncology patients) and the INSPECT cohort (19,402 pulmonary embolism patients) demonstrate that the learned representations capture disease-progression dynamics and significantly outperform autoregressive baselines on highly heterogeneous, complex clinical tasks.

📝 Abstract
Large language models (LLMs) trained with next-word prediction have achieved success as clinical foundation models. Representations from these language backbones yield strong linear-probe performance across biomedical tasks, suggesting that patient semantics emerge from next-token prediction at scale. However, this paradigm treats a patient as a document to be summarized rather than a dynamical system to be simulated; a patient's trajectory emerges from their state evolving under interventions and time, requiring models that simulate dynamics rather than predict tokens. To address this, we introduce SMB-Structure, a world model for structured EHR that grounds a joint-embedding prediction architecture (JEPA) with next-token prediction (SFT). SFT grounds our model to reconstruct future patient states in token space, while JEPA predicts those futures in latent space from the initial patient representation alone, forcing trajectory dynamics to be encoded before the next state is observed. We validate across two large-scale cohorts: Memorial Sloan Kettering (23,319 oncology patients; 323,000+ patient-years) and INSPECT (19,402 pulmonary embolism patients). Using a linear probe evaluated at multiple points along the disease trajectory, we demonstrate that our training paradigm learns embeddings that capture disease dynamics not recoverable by autoregressive baselines, enabling SMB-Structure to achieve competitive performance on complex tasks characterized by high patient heterogeneity. Model weights are available at https://huggingface.co/standardmodelbio/SMB-v1-1.7B-Structure.
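The abstract's core mechanism can be illustrated with a minimal sketch: an encoder embeds the initial patient state, a predictor rolls that embedding forward in latent space, and the JEPA loss compares the prediction against an encoding of the observed future state. All names, dimensions, and weights below are hypothetical illustrations, not the authors' implementation, and the target encoder is simplified to share weights with the online encoder (in practice JEPA uses an EMA copy with stop-gradient).

```python
def matvec(w, x):
    """Multiply a weight matrix (list of rows) by a vector."""
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in w]

def encode(w_enc, state):
    """Online encoder: a single linear map from patient state to latent space."""
    return matvec(w_enc, state)

def predict(w_pred, latent):
    """Predictor: rolls the initial latent forward to the next time step."""
    return matvec(w_pred, latent)

def jepa_loss(w_enc, w_pred, initial_state, future_state):
    """Squared error between predicted and target future latents.

    Weight sharing between online and target encoders is a simplification;
    JEPA-style training uses an EMA target with stop-gradient.
    """
    z0 = encode(w_enc, initial_state)           # embed initial patient state
    z1_pred = predict(w_pred, z0)               # predict the future latent
    z1_target = encode(w_enc, future_state)     # encode the observed future
    return sum((p - t) ** 2 for p, t in zip(z1_pred, z1_target))

# Toy example: 3-dim patient state, 2-dim latent, hand-fixed weights.
w_enc = [[0.5, 0.0, 0.5],
         [0.0, 1.0, 0.0]]
w_pred = [[1.0, 0.0],
          [0.0, 1.0]]  # identity predictor: assumes the state is static
loss = jepa_loss(w_enc, w_pred, [1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
print(loss)  # identity predictor on an unchanged state -> zero loss
```

The point of the latent-space objective, as the abstract frames it, is that the predictor must encode trajectory dynamics from the initial representation alone rather than copying observed tokens forward.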
Problem

Research questions and friction points this paper is trying to address.

longitudinal EHR
patient dynamics
world model
clinical foundation models
disease trajectory
Innovation

Methods, ideas, or system contributions that make the work stand out.

world model
longitudinal EHR
joint-embedding prediction architecture
disease dynamics
next-token prediction
Irsyad Adam
Medical Informatics PhD, UCLA
Knowledge Graphs, GNNs, Multi-Omics Integration, Multi-Modal Fusion Models, Model Explainability
Zekai Chen
Standard Model Biomedicine
Deep Learning, LLM, CV, Biomedical AI
David Laprade
Standard Model Biomedicine
Shaun Porwal
Standard Model Biomedicine
David Laub
Standard Model Biomedicine
Erik Reinertsen
Standard Model Biomedicine
artificial intelligence, machine learning, foundation models
Arda Pekis
Standard Model Biomedicine
Kevin Brown
Standard Model Biomedicine