🤖 AI Summary
This paper identifies a pervasive “driver-blindness” problem in deep learning sequence models for blood glucose prediction: models over-rely on glucose autoregression while neglecting critical clinical drivers such as insulin administration, dietary intake, and physical activity. To address this, we propose Δ_drivers—a novel, quantifiable metric that formally evaluates a model’s utilization of multivariate driver information. We identify three root causes: architectural biases favoring temporal autocorrelation, data distortions (e.g., irregular sampling, missingness), and physiological heterogeneity across individuals. Accordingly, we design a physiology-aware modeling framework integrating causal regularization, physiologically grounded feature encoders, and personalized modeling. Experiments show mainstream models exhibit Δ_drivers ≈ 0, confirming severe underutilization of driver signals; incorporating physiological mechanisms significantly increases Δ_drivers, enhancing both predictive accuracy and clinical interpretability. Our work advances glucose forecasting toward mechanism-informed, clinically trustworthy decision support.
📝 Abstract
Deep sequence models for blood glucose forecasting consistently fail to leverage clinically informative drivers--insulin, meals, and activity--despite well-understood physiological mechanisms. We term this Driver-Blindness and formalize it via $Delta_{ ext{drivers}}$, the performance gain of multivariate models over matched univariate baselines. Across the literature, $Delta_{ ext{drivers}}$ is typically near zero. We attribute this to three interacting factors: architectural biases favoring autocorrelation (C1), data fidelity gaps that render drivers noisy and confounded (C2), and physiological heterogeneity that undermines population-level models (C3). We synthesize strategies that partially mitigate Driver-Blindness--including physiological feature encoders, causal regularization, and personalization--and recommend that future work routinely report $Delta_{ ext{drivers}}$ to prevent driver-blind models from being considered state-of-the-art.