🤖 AI Summary
In critical care, modeling long-duration accelerometer data and robustly assessing patient activity remain challenging. This paper proposes MELON, an end-to-end multimodal framework featuring a novel dual-branch spectral–temporal fusion architecture: one branch leverages a pretrained Vision Transformer (ViT) to encode spectrogram-based visual representations, while the other employs a Mixture-of-Experts (MoE) sequence encoder to model long-term statistical characteristics of acceleration signals. MELON enables accurate 12-hour mobility state prediction from a single wrist-worn sensor. Evaluated on data from 126 ICU patients, it achieves an AUROC of 0.82 (95% CI: 0.78–0.86), significantly outperforming conventional methods. Notably, wrist-based deployment attains performance comparable to ankle-based alternatives, demonstrating strong clinical validity while offering advantages in lightweight design and low-cost implementation.
📝 Abstract
Patient mobility monitoring in intensive care is critical for ensuring timely interventions and improving clinical outcomes. While accelerometry-based sensor data are widely adopted in training artificial intelligence models to estimate patient mobility, existing approaches face two key limitations highlighted in clinical practice: (1) modeling long-term accelerometer data is challenging due to its high dimensionality, variability, and noise, and (2) efficient and robust methods for long-term mobility assessment are lacking. To overcome these challenges, we introduce MELON, a novel multimodal framework designed to predict 12-hour mobility status in the critical care setting. MELON employs a dual-branch network architecture that combines the strengths of spectrogram-based visual representations and sequential accelerometer statistical features. MELON effectively captures both global and fine-grained mobility patterns by integrating a pre-trained image encoder for rich frequency-domain feature extraction with a Mixture-of-Experts encoder for sequence modeling. We trained and evaluated MELON on a multimodal dataset of 126 patients recruited from nine Intensive Care Units at the University of Florida Health Shands Hospital main campus in Gainesville, Florida. Experiments showed that MELON outperforms conventional approaches for 12-hour mobility status estimation, with an overall area under the receiver operating characteristic curve (AUROC) of 0.82 (95% confidence interval 0.78–0.86). Notably, our experiments also revealed that accelerometer data collected from the wrist provides predictive performance comparable to data from the ankle, suggesting a single-sensor solution that can reduce patient burden and lower deployment costs.
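To make the dual-branch idea concrete, here is a minimal NumPy sketch of the fusion pattern the abstract describes: one branch derives frequency-domain features from a spectrogram of the accelerometer signal (standing in for the pre-trained image encoder), the other computes per-window summary statistics (standing in for the Mixture-of-Experts sequence encoder), and the two are fused by a simple linear head. This is not the authors' implementation; all function names, window sizes, and weights are illustrative assumptions.

```python
import numpy as np

def spectrogram_features(signal, win=64, hop=32):
    """Branch 1 (illustrative): magnitude spectrogram of the 1-D signal,
    averaged over time to a single spectral profile."""
    frames = [signal[i:i + win] for i in range(0, len(signal) - win + 1, hop)]
    spec = np.abs(np.fft.rfft(np.array(frames) * np.hanning(win), axis=1))
    return spec.mean(axis=0)  # shape: (win // 2 + 1,)

def statistical_features(signal, win=64):
    """Branch 2 (illustrative): per-window summary statistics, pooled."""
    windows = signal[: len(signal) // win * win].reshape(-1, win)
    stats = np.stack([windows.mean(1), windows.std(1),
                      windows.min(1), windows.max(1)], axis=1)
    return stats.mean(axis=0)  # shape: (4,)

def fused_score(signal, w_spec, w_stat, bias=0.0):
    """Late fusion: concatenate branch outputs (here, via two dot products)
    and apply a linear + sigmoid head to score the mobility state."""
    z = (np.dot(spectrogram_features(signal), w_spec)
         + np.dot(statistical_features(signal), w_stat) + bias)
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic example with random weights (for illustration only).
rng = np.random.default_rng(0)
signal = rng.standard_normal(1024)        # stand-in 1-D acceleration trace
w_spec = rng.standard_normal(33) * 0.01   # rfft of win=64 -> 33 bins
w_stat = rng.standard_normal(4) * 0.01
p = fused_score(signal, w_spec, w_stat)
print(f"predicted mobility probability: {p:.4f}")
```

In the paper's full model, each branch is a learned deep encoder and fusion happens on high-dimensional embeddings; this sketch only shows the two-branch, fuse-then-classify structure.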