Non-Contact Health Monitoring During Daily Personal Care Routines

📅 2025-06-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Remote photoplethysmography (rPPG) physiological monitoring is not robust under challenging real-world conditions such as drastic illumination variations (e.g., in high-altitude environments), frequent facial occlusions, and dynamic head poses. To address this, the paper proposes a non-contact, long-term health monitoring framework tailored to daily personal care scenarios (e.g., mirror-based grooming). Methodologically, the authors introduce LADH, the first long-duration, multimodal rPPG dataset, featuring synchronized RGB and IR video streams alongside ground-truth PPG, respiration, and blood oxygen signals. They further propose the first RGB-IR cross-modal fusion strategy integrated with multi-task deep learning, together with a feature extraction mechanism robust to dynamic facial motion. Experiments demonstrate state-of-the-art performance: a mean absolute error of only 4.99 BPM in heart rate estimation, significantly outperforming existing methods under strong illumination interference, hand-induced occlusions, and large pose variations. Both the code and the LADH dataset are publicly released.

📝 Abstract
Remote photoplethysmography (rPPG) enables non-contact, continuous monitoring of physiological signals and offers a practical alternative to traditional health sensing methods. Although rPPG is promising for daily health monitoring, its application in long-term personal care scenarios, such as mirror-facing routines in high-altitude environments, remains challenging due to ambient lighting variations, frequent occlusions from hand movements, and dynamic facial postures. To address these challenges, we present LADH (Long-term Altitude Daily Health), the first long-term rPPG dataset containing 240 synchronized RGB and infrared (IR) facial videos from 21 participants across five common personal care scenarios, along with ground-truth PPG, respiration, and blood oxygen signals. Our experiments demonstrate that combining RGB and IR video inputs improves the accuracy and robustness of non-contact physiological monitoring, achieving a mean absolute error (MAE) of 4.99 BPM in heart rate estimation. Furthermore, we find that multi-task learning enhances performance across multiple physiological indicators simultaneously. Dataset and code are open at https://github.com/McJackTang/FusionVitals.
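The abstract's headline number, an MAE of 4.99 BPM for heart rate, comes from comparing the rate recovered from the predicted pulse waveform against the ground-truth PPG. A common way to read heart rate off an rPPG waveform is to take the dominant spectral peak inside the plausible heart-rate band; the sketch below illustrates that idea on a synthetic signal (this is a generic illustration, not the paper's actual pipeline, and the function name and band limits are assumptions):

```python
import numpy as np

def estimate_hr_bpm(signal, fs, lo=0.7, hi=3.0):
    """Estimate heart rate (BPM) as the dominant FFT peak of a pulse
    waveform inside the plausible heart-rate band lo..hi Hz
    (0.7-3.0 Hz, i.e. 42-180 BPM)."""
    signal = signal - np.mean(signal)            # remove DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs <= hi)         # restrict to HR band
    return 60.0 * freqs[band][np.argmax(power[band])]

# Synthetic check: a 72 BPM pulse sampled at 30 fps for 30 s, light noise.
rng = np.random.default_rng(0)
fs, bpm = 30.0, 72.0
t = np.arange(0, 30, 1.0 / fs)
wave = np.sin(2 * np.pi * (bpm / 60.0) * t) + 0.1 * rng.standard_normal(t.size)
est = estimate_hr_bpm(wave, fs)
error = abs(est - bpm)  # absolute error for this clip; averaging such
                        # errors over a dataset gives the reported MAE
```

With a 30 s window the FFT bin spacing is 1/30 Hz (2 BPM), so the estimate lands within that resolution of the true rate.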
Problem

Research questions and friction points this paper is trying to address.

Address ambient lighting and occlusion challenges in rPPG monitoring
Improve accuracy of non-contact physiological signal measurement
Enable multi-task learning for diverse health indicators
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines RGB and IR video for robust monitoring
Introduces multi-task learning for multiple indicators
Provides first long-term rPPG dataset LADH
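The fusion idea above can be pictured as two modality branches whose features are concatenated and shared by one prediction head per physiological task. The toy sketch below shows that late-fusion, multi-head shape in plain NumPy; the class name, layer sizes, and task names are illustrative assumptions, not the actual FusionVitals architecture (which is a deep video network; see the released code):

```python
import numpy as np

rng = np.random.default_rng(0)

class FusionMultiTaskSketch:
    """Toy late-fusion model: each modality (RGB, IR) gets its own
    linear feature projection; the concatenated features feed one
    linear head per task (heart rate, respiration, SpO2)."""

    def __init__(self, dim_rgb, dim_ir, hidden, tasks=("hr", "resp", "spo2")):
        self.w_rgb = rng.standard_normal((dim_rgb, hidden)) * 0.1
        self.w_ir = rng.standard_normal((dim_ir, hidden)) * 0.1
        # One output head per task, all sharing the fused representation.
        self.heads = {t: rng.standard_normal((2 * hidden, 1)) * 0.1
                      for t in tasks}

    def __call__(self, x_rgb, x_ir):
        # Project each modality, then fuse by concatenation.
        feats = np.concatenate([x_rgb @ self.w_rgb, x_ir @ self.w_ir], axis=1)
        return {t: feats @ w for t, w in self.heads.items()}

model = FusionMultiTaskSketch(dim_rgb=64, dim_ir=64, hidden=16)
out = model(rng.standard_normal((4, 64)), rng.standard_normal((4, 64)))
```

The point of the shared fused representation is that all three tasks are supervised jointly, which is the multi-task effect the paper reports as improving each indicator.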
Xulin Ma
Department of Computer Technology and Applications, Qinghai University, Qinghai, 810016
Jiankai Tang
Tsinghua University
Design, Ubiquitous Computing, Physiological Sensing
Zhang Jiang
Physicist at Argonne National Laboratory
Film and surface, GISAXS, XPCS, coherent imaging, speckles
Songqin Cheng
National Key Laboratory of Human Factors Engineering, Beijing, 100094
Yuanchun Shi
Professor
Human-Computer Interaction
Li Dong
Department of Computer Technology and Applications, Qinghai University, Qinghai, 810016
Xin Liu
Paul G. Allen School of Computer Science & Engineering, University of Washington, Seattle WA, 352350
Daniel McDuff
Google and University of Washington
Affective Computing, Deep Learning, Human-Computer Interaction, Human-Centered AI, Computer Vision
Xiaojing Liu
Department of Computer Technology and Applications, Qinghai University, Qinghai, 810016
Yuntao Wang
Tsinghua University
Human-Computer Interaction, Ubiquitous Computing, Physio-Behavioral Computing