🤖 AI Summary
Longitudinal electronic health record (EHR) data pose significant challenges for long-term lung cancer risk prediction due to their extended temporal span and high noise levels. To address this, we propose a patient trajectory modeling framework based on a chain of agents. The framework comprises a chain of collaborative worker agents, a shared memory module (EHRMem) with temporal summarization capability, and a manager agent responsible for global reasoning, enabling key clinical event extraction, noise suppression, and interpretable long-term inference. Our approach integrates large language models, chain-of-thought reasoning, and a shared memory mechanism, requiring no fine-tuning to process five-year EHR sequences. In zero-shot one-year lung cancer risk prediction, our method outperforms baselines from four categories, demonstrating superior robustness and generalizability. This work establishes a novel paradigm for clinically trustworthy temporal modeling of EHRs.
📝 Abstract
Large language models (LLMs) offer a generalizable approach for modeling patient trajectories, but their temporal reasoning suffers from the long and noisy nature of electronic health record (EHR) data. To address these challenges, we introduce Traj-CoA, a multi-agent system that applies a chain of agents to patient trajectory modeling. Traj-CoA employs a chain of worker agents to process EHR data sequentially in manageable chunks, distilling critical events into a shared long-term memory module, EHRMem, to reduce noise and preserve a comprehensive timeline. A final manager agent synthesizes the worker agents' summaries and the timeline extracted in EHRMem to make predictions. In a zero-shot one-year lung cancer risk prediction task based on five-year EHR data, Traj-CoA outperforms baselines from four categories. Analysis reveals that Traj-CoA exhibits clinically aligned temporal reasoning, establishing it as a promising, robust, and generalizable approach for modeling complex patient trajectories.
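The worker-chain / shared-memory / manager architecture described above can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the names (`EHRMem`, `worker_agent`, `manager_agent`, `traj_coa`), the toy salience filter, and the toy decision rule are all hypothetical stand-ins for LLM calls and learned reasoning.

```python
from dataclasses import dataclass, field

@dataclass
class EHRMem:
    """Shared long-term memory: an ordered timeline of distilled key events."""
    timeline: list = field(default_factory=list)

    def add(self, date: str, event: str) -> None:
        self.timeline.append((date, event))

    def sorted_timeline(self) -> list:
        # Preserve a comprehensive, chronologically ordered event timeline.
        return sorted(self.timeline, key=lambda e: e[0])

def worker_agent(chunk: list, prev_summary: list, mem: EHRMem) -> list:
    """Process one manageable chunk of EHR records (stand-in for an LLM call):
    distill critical events into EHRMem and pass a rolling summary onward."""
    critical = [(d, r["event"]) for d, r in chunk if r.get("critical")]
    for date, event in critical:          # noise suppression: keep key events only
        mem.add(date, event)
    return prev_summary + [e for _, e in critical]

def manager_agent(summary: list, mem: EHRMem) -> dict:
    """Synthesize the workers' summary and the EHRMem timeline into a prediction."""
    timeline = mem.sorted_timeline()
    risk = "high" if len(timeline) >= 2 else "low"  # toy decision rule, not the paper's
    return {"risk": risk, "evidence": timeline, "summary": summary}

def traj_coa(chunks: list) -> dict:
    """Run the worker chain sequentially over chunks, then the manager."""
    mem, summary = EHRMem(), []
    for chunk in chunks:
        summary = worker_agent(chunk, summary, mem)
    return manager_agent(summary, mem)

# Toy five-year trajectory split into two chunks.
chunks = [
    [("2019-03", {"event": "heavy smoking history noted", "critical": True}),
     ("2019-06", {"event": "routine visit", "critical": False})],
    [("2021-01", {"event": "pulmonary nodule on chest CT", "critical": True})],
]
result = traj_coa(chunks)
```

The key design point the sketch mirrors is that each worker sees only its own chunk plus a rolling summary, while EHRMem persists across the whole chain, so the manager reasons over a denoised global timeline rather than the full raw record.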