Personalized Longitudinal Medical Report Generation via Temporally-Aware Federated Adaptation

📅 2026-02-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges of longitudinal medical report generation under privacy constraints and the dynamic evolution of disease, where existing federated learning approaches often neglect temporal dependencies and patient heterogeneity, leading to model instability and suboptimal report quality. To overcome these limitations, we propose FedTAR, a novel framework that, for the first time, explicitly models the temporal dynamics of client data within a federated learning paradigm. FedTAR integrates demographic-informed personalized LoRA adapters with a temporal residual aggregation mechanism whose visit weights are meta-learned via first-order MAML, enabling high-quality report generation while preserving privacy. Evaluated on the J-MID (1 million examinations) and MIMIC-CXR datasets, our method significantly improves linguistic accuracy, temporal consistency, and cross-institutional generalization.

📝 Abstract
Longitudinal medical report generation is clinically important yet remains challenging due to strict privacy constraints and the evolving nature of disease progression. Although federated learning (FL) enables collaborative training without data sharing, existing FL methods largely overlook longitudinal dynamics by assuming stationary client distributions, making them unable to model temporal shifts across visits or patient-specific heterogeneity, ultimately leading to unstable optimization and suboptimal report generation. We introduce Federated Temporal Adaptation (FTA), a federated setting that explicitly accounts for the temporal evolution of client data. Building upon this setting, we propose FedTAR, a framework that integrates demographic-driven personalization with time-aware global aggregation. FedTAR generates lightweight LoRA adapters from demographic embeddings and performs temporal residual aggregation, where updates from different visits are weighted by a meta-learned temporal policy optimized via first-order MAML. Experiments on J-MID (1M exams) and MIMIC-CXR demonstrate consistent improvements in linguistic accuracy, temporal coherence, and cross-site generalization, establishing FedTAR as a robust and privacy-preserving paradigm for federated longitudinal modeling.
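The temporal residual aggregation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each client submits per-visit parameter residuals, and that the meta-learned temporal policy reduces to a vector of per-visit scores turned into weights via a softmax (all variable names here are illustrative).

```python
import numpy as np

def temporal_residual_aggregate(residuals, scores):
    """Combine per-visit residuals of shape [T, D] using learned
    per-visit scores of shape [T] (softmax-normalized weights)."""
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    # weighted sum over the visit axis -> one aggregated update [D]
    return (weights[:, None] * residuals).sum(axis=0)

# Toy example: 3 visits, 4 parameters; later visits scored higher,
# mimicking a policy that emphasizes recent disease state.
residuals = np.array([[0.1, 0.0, 0.2, 0.0],
                      [0.0, 0.3, 0.0, 0.1],
                      [0.2, 0.1, 0.1, 0.3]])
scores = np.array([0.1, 0.5, 1.0])  # in FedTAR these would be meta-learned
update = temporal_residual_aggregate(residuals, scores)
```

In the full method, the scores would come from a temporal policy optimized with first-order MAML rather than being fixed by hand.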
Problem

Research questions and friction points this paper is trying to address.

Longitudinal medical report generation
Temporal dynamics
Patient-specific heterogeneity
Privacy constraints
Federated learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Learning
Longitudinal Modeling
Temporal Adaptation
Personalized Report Generation
LoRA Adapters
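To make the "demographic-driven personalization" contribution concrete, here is a hedged sketch of generating a LoRA adapter from a demographic embedding via a small linear hypernetwork. The hypernetwork shapes and the zero-initialized B matrix follow standard LoRA practice; none of the names below come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, rank, d_demo = 8, 2, 4  # illustrative sizes

# Hypernetwork weights: map a demographic embedding to the flattened
# low-rank factors A and B. B starts at zero so the adapter is initially
# a no-op, as in standard LoRA initialization.
W_a = rng.normal(scale=0.1, size=(d_demo, d_model * rank))
W_b = np.zeros((d_demo, rank * d_model))

def make_lora(demo_emb):
    """Generate per-patient LoRA factors from demographic features."""
    A = (demo_emb @ W_a).reshape(d_model, rank)
    B = (demo_emb @ W_b).reshape(rank, d_model)
    return A, B

W0 = rng.normal(size=(d_model, d_model))  # frozen base weight matrix
demo = rng.normal(size=(d_demo,))         # e.g. age/sex/site features
A, B = make_lora(demo)
W_adapted = W0 + A @ B                    # adapted weight: W0 + A·B
```

Only the lightweight hypernetwork (and adapter residuals) would need to participate in federated aggregation, which is what keeps the scheme communication-efficient and privacy-friendly.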