🤖 AI Summary
To address two key challenges in cross-population long-term blood glucose prediction—heterogeneous signal sampling rates and catastrophic forgetting induced by continual learning—this paper proposes a Transformer-based multimodal continual learning framework. The framework introduces a cross-modal attention mechanism to fuse irregularly sampled physiological and behavioral signals; incorporates a parallel multi-scale temporal attention mechanism to capture dynamic sequential patterns at varying granularities; and integrates a knowledge retention module to mitigate forgetting during incremental adaptation to new patient cohorts. Evaluated on the AIREADI dataset, the model achieves approximately 15% and 9% reductions in RMSE and MAE, respectively, over baselines, demonstrating substantial improvements in cross-population generalizability and long-term predictive stability. This work establishes a robust, sustainable modeling paradigm for deployable, personalized glycemic management.
📝 Abstract
This paper proposes GluMind, a transformer-based multimodal framework designed for continual and long-term blood glucose forecasting. GluMind employs two attention mechanisms, cross-attention and multi-scale attention, which operate in parallel and deliver accurate predictive performance. Cross-attention effectively integrates blood glucose data with other physiological and behavioral signals such as activity, stress, and heart rate, addressing the challenges that heterogeneous sampling rates pose for robust prediction. The multi-scale attention mechanism, in turn, captures long-range temporal dependencies. To mitigate catastrophic forgetting, GluMind incorporates a knowledge retention technique into the transformer-based forecasting model. The knowledge retention module not only enhances the model's ability to retain prior knowledge but also boosts its overall forecasting performance. We evaluate GluMind on the recently released AIREADI dataset, which contains behavioral and physiological data collected from healthy individuals, individuals with prediabetes, and those with type 2 diabetes. We examine the performance stability and adaptability of GluMind as it learns continuously while new patient cohorts are introduced. Experimental results show that GluMind consistently outperforms other state-of-the-art forecasting models, achieving approximately 15% and 9% improvements in root mean squared error (RMSE) and mean absolute error (MAE), respectively.
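To illustrate why cross-attention sidesteps the sampling-rate mismatch, here is a minimal NumPy sketch (not the paper's implementation; the dimensions, sampling ratios, and random embeddings are assumptions for illustration). Glucose time steps act as queries and attend over a more densely sampled auxiliary stream, such as heart rate, so the two modalities are fused without resampling either onto a shared time grid.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over attention scores.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """Scaled dot-product attention: one modality's steps attend over another's.

    queries: (T_q, d) glucose embeddings; keys/values: (T_kv, d) auxiliary
    embeddings. T_q and T_kv may differ, which is the point: no resampling.
    """
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)   # (T_q, T_kv)
    weights = softmax(scores, axis=-1)       # each glucose step's mix of HR steps
    return weights @ values                  # (T_q, d) fused representation

rng = np.random.default_rng(0)
d_model = 16                                  # assumed embedding width
glucose = rng.normal(size=(60, d_model))      # e.g. one CGM embedding per 5 min
heart_rate = rng.normal(size=(240, d_model))  # HR sampled ~4x more frequently

fused = cross_attention(glucose, heart_rate, heart_rate)
print(fused.shape)  # (60, 16): one fused vector per glucose step
```

The fused sequence keeps the glucose stream's length, so it can feed the downstream forecasting head directly; the denser modality contributes through the attention weights rather than through interpolation or downsampling.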