GluMind: Multimodal Parallel Attention and Knowledge Retention for Robust Cross-Population Blood Glucose Forecasting

📅 2025-09-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address two key challenges in cross-population long-term blood glucose prediction—heterogeneous signal sampling rates and catastrophic forgetting induced by continual learning—this paper proposes a Transformer-based multimodal continual learning framework. The framework introduces a cross-modal parallel attention mechanism to fuse irregularly sampled physiological and behavioral signals; incorporates multi-scale temporal attention to capture dynamic sequential patterns at varying granularities; and integrates a knowledge preservation module to effectively mitigate forgetting during incremental adaptation to new patient cohorts. Evaluated on the AIREADI dataset, the model achieves approximately 15% reduction in RMSE and 9% reduction in MAE compared to baselines, demonstrating substantial improvements in cross-population generalizability and long-term predictive stability. This work establishes a robust, sustainable modeling paradigm for deployable, personalized glycemic management.

📝 Abstract
This paper proposes GluMind, a transformer-based multimodal framework designed for continual, long-term blood glucose forecasting. GluMind introduces two attention mechanisms, cross-attention and multi-scale attention, which operate in parallel to deliver accurate predictions. Cross-attention integrates blood glucose data with other physiological and behavioral signals such as activity, stress, and heart rate, addressing the challenges posed by varying sampling rates and their adverse impact on robust prediction, while the multi-scale attention mechanism captures long-range temporal dependencies. To mitigate catastrophic forgetting, GluMind incorporates a knowledge retention technique into the transformer-based forecasting model; this module not only helps the model retain prior knowledge but also boosts its overall forecasting performance. We evaluate GluMind on the recently released AIREADI dataset, which contains behavioral and physiological data collected from healthy people, individuals with prediabetes, and people with type 2 diabetes, and examine its performance stability and adaptability as it learns continually from newly introduced patient cohorts. Experimental results show that GluMind consistently outperforms other state-of-the-art forecasting models, achieving approximately 15% and 9% improvements in root mean squared error (RMSE) and mean absolute error (MAE), respectively.
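The reported gains are measured in RMSE and MAE. For reference, a minimal pure-Python sketch of the two metrics; the glucose readings below are made up purely for illustration, not taken from the paper:

```python
import math

def rmse(y_true, y_pred):
    # Root mean squared error: squares each error, so large glucose
    # excursions are penalized more heavily than small ones.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    # Mean absolute error: average magnitude of the prediction error,
    # in the same units as the signal (e.g., mg/dL).
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical CGM readings (mg/dL) vs. model predictions.
actual    = [110.0, 140.0,  95.0, 180.0]
predicted = [115.0, 130.0, 100.0, 170.0]

print(rmse(actual, predicted))  # sqrt(62.5), about 7.91
print(mae(actual, predicted))   # 7.5
```

Because RMSE weights large errors more strongly, a 15% RMSE improvement alongside a 9% MAE improvement suggests the gains are concentrated where errors are largest, which matters clinically for hypo- and hyperglycemic excursions.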
Problem

Research questions and friction points this paper is trying to address.

Accurately forecasting blood glucose levels across diverse population groups
Effectively integrating multimodal physiological data with varying sampling rates
Preventing catastrophic forgetting during continual learning on new patient cohorts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parallel cross-attention integrates glucose with physiological signals
Multi-scale attention captures long-range temporal dependencies
Knowledge retention technique prevents catastrophic forgetting
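The paper's implementation is not reproduced here; the sketch below is a minimal pure-Python illustration of the two parallel branches described above. The function name `glumind_style_parallel_attention`, the stride-based subsampling used to imitate multi-scale attention, the assumption that all modalities share one feature dimension, and the sum-based fusion of the branches are all simplifications for illustration, not the authors' exact architecture:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(queries, keys, values):
    # Scaled dot-product attention over lists of feature vectors.
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, values))
                    for j in range(len(values[0]))])
    return out

def glumind_style_parallel_attention(glucose, other_modalities, scales=(1, 2)):
    # Branch 1 -- cross-attention: glucose tokens query the other signals
    # (activity, stress, heart rate), so modalities sampled at different
    # rates are fused without forcing them onto a shared time grid.
    cross = attention(glucose, other_modalities, other_modalities)

    # Branch 2 -- multi-scale self-attention: attend over the glucose
    # series subsampled at several strides, approximating attention at
    # different temporal granularities to capture long-range dependencies.
    multi = [[0.0] * len(glucose[0]) for _ in glucose]
    for s in scales:
        coarse = glucose[::s]
        out = attention(glucose, coarse, coarse)
        multi = [[a + b / len(scales) for a, b in zip(row, o)]
                 for row, o in zip(multi, out)]

    # The parallel branches are combined; element-wise summation here,
    # assuming both modalities use the same feature dimension.
    return [[c + m for c, m in zip(cr, mr)] for cr, mr in zip(cross, multi)]
```

The knowledge retention module is not detailed in this summary; in continual-learning practice it would typically add a regularization or replay term on top of a backbone like the one sketched here.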