🤖 AI Summary
To address the challenge of deploying photoplethysmography (PPG)-based foundation models on resource-constrained wearable devices, this paper proposes PPG-Distill, a novel knowledge distillation framework. Methodologically, it jointly performs morphological distillation—preserving local waveform structures—and rhythmic distillation—capturing inter-patch temporal dependencies—to transfer both global and local knowledge. Furthermore, it introduces a multi-granularity distillation mechanism operating at the prediction, feature, and patch levels. Evaluated on heart rate estimation and atrial fibrillation detection tasks, the lightweight student model achieves up to a 21.8% performance improvement, while running inference 7× faster and using 19× less memory than the teacher. These improvements significantly enhance real-time physiological monitoring at the edge.
📝 Abstract
Photoplethysmography (PPG) is widely used in wearable health monitoring, yet large PPG foundation models remain difficult to deploy on resource-limited devices. We present PPG-Distill, a knowledge distillation framework that transfers both global and local knowledge through prediction-, feature-, and patch-level distillation. PPG-Distill incorporates morphology distillation to preserve local waveform patterns and rhythm distillation to capture inter-patch temporal structures. On heart rate estimation and atrial fibrillation detection, PPG-Distill improves student performance by up to 21.8% while achieving 7× faster inference and reducing memory usage by 19×, enabling efficient PPG analysis on wearables.
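The multi-granularity distillation described above can be sketched as a weighted sum of prediction-, feature-, and patch-level matching terms. The sketch below is an illustrative assumption, not the paper's implementation: the function name `distill_loss`, the MSE matching criterion, the loss weights, and all array shapes are hypothetical placeholders.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two arrays."""
    return float(np.mean((a - b) ** 2))

def distill_loss(teacher, student, w_pred=1.0, w_feat=0.5, w_patch=0.5):
    """Combine three distillation terms into one scalar loss (illustrative).

    `teacher` and `student` are dicts of aligned outputs:
      'pred'  : task predictions, shape (batch,)
      'feat'  : pooled sequence features, shape (batch, dim)
      'patch' : per-patch embeddings, shape (batch, n_patches, dim)
    """
    l_pred = mse(teacher["pred"], student["pred"])    # prediction level
    l_feat = mse(teacher["feat"], student["feat"])    # feature level
    l_patch = mse(teacher["patch"], student["patch"]) # patch level (morphology/rhythm)
    return w_pred * l_pred + w_feat * l_feat + w_patch * l_patch

# Toy usage: a student whose outputs are a noisy copy of the teacher's.
rng = np.random.default_rng(0)
teacher = {"pred": rng.normal(size=8),
           "feat": rng.normal(size=(8, 16)),
           "patch": rng.normal(size=(8, 4, 16))}
student = {k: v + 0.1 * rng.normal(size=v.shape) for k, v in teacher.items()}
loss = distill_loss(teacher, student)
```

Minimizing such a combined objective pushes the student to match the teacher at every granularity, which is the intuition behind the prediction/feature/patch hierarchy above.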