A Nutrition Multimodal Photoplethysmography Language Model

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of noninvasive, large-scale, dynamic monitoring of hunger and satiety states to improve dietary behavior tracking and metabolic health assessment. The authors propose NPLM, the first nutrition multimodal language model that maps photoplethysmography (PPG) signals collected via wearables into language-model-compatible embeddings, enabling cross-modal joint reasoning between physiological signals and dietary text. The method combines self-supervised representation learning with sequence modeling, trained on a large-scale PPG-meal paired dataset of 19,340 individuals and 1.1 million meals. The model improves daily caloric intake prediction by 11% over text-only baselines, remains accurate even when most meal text is absent, and replicates these results on an independent validation set. The core contribution is the construction of a PPG-to-language cross-modal interface, establishing a new paradigm for noninvasive, continuous nutritional sensing.
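To make the "PPG-to-language interface" concrete, here is a minimal sketch (not the paper's code) of the mechanism the summary describes: a small encoder turns raw PPG windows into a sequence of vectors, and a linear projection maps them into a language model's embedding space so they can be consumed alongside text tokens. All dimensions, layer choices, and the sampling rate are illustrative assumptions.

```python
import torch
import torch.nn as nn

class PPGToLMEmbedding(nn.Module):
    """Sketch: project raw PPG into language-model-compatible embeddings."""
    def __init__(self, ppg_channels=1, d_encoder=256, d_lm=768, n_layers=4):
        super().__init__()
        # Strided 1-D convolutions downsample the raw waveform into frames.
        self.frontend = nn.Sequential(
            nn.Conv1d(ppg_channels, d_encoder, kernel_size=16, stride=8),
            nn.GELU(),
            nn.Conv1d(d_encoder, d_encoder, kernel_size=8, stride=4),
            nn.GELU(),
        )
        # A small Transformer models temporal structure across frames.
        layer = nn.TransformerEncoderLayer(
            d_model=d_encoder, nhead=4, batch_first=True)
        self.temporal = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Linear projection into the language model's token-embedding space.
        self.to_lm = nn.Linear(d_encoder, d_lm)

    def forward(self, ppg):  # ppg: (batch, channels, samples)
        frames = self.frontend(ppg).transpose(1, 2)  # (batch, frames, d_enc)
        frames = self.temporal(frames)
        return self.to_lm(frames)  # (batch, frames, d_lm): "PPG tokens"

# Example: a 60 s PPG window at an assumed 64 Hz becomes a short sequence
# of vectors the language model can treat like ordinary token embeddings.
tokens = PPGToLMEmbedding()(torch.randn(2, 1, 64 * 60))
print(tokens.shape)  # torch.Size([2, 118, 768])
```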

📝 Abstract
Hunger and satiety dynamics shape dietary behaviors and metabolic health, yet remain difficult to capture in everyday settings. We present a Nutrition Photoplethysmography Language Model (NPLM), integrating continuous photoplethysmography (PPG) from wearables with meal descriptions. NPLM projects PPG into embeddings interpretable by language models, enabling joint reasoning over physiology and meal context. Trained on 19,340 participants and 1.1 million meal-PPG pairs, the model improved daily caloric intake prediction by 11% over text-only baselines, with accuracy maintained when 80% of meal text was removed. In an independent validation study (n=140) with controlled dining and detailed meal information, the model replicated these findings. These results demonstrate the value of integrating physiological measurements from consumer wearables with meal information for noninvasive dietary monitoring at scale.
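The abstract's robustness result (accuracy maintained with 80% of meal text removed) suggests the model does not over-rely on the text channel. The sketch below shows one plausible way such robustness could be encouraged or evaluated: randomly blanking meal descriptions while keeping the PPG stream. The drop probability and this exact recipe are assumptions for illustration, not the authors' published procedure.

```python
import random

def drop_meal_text(meal_texts, p_drop=0.8, seed=None):
    """Randomly blank out meal descriptions while keeping the PPG channel,
    forcing the model to fall back on physiology when text is missing."""
    rng = random.Random(seed)
    return [t if rng.random() > p_drop else "" for t in meal_texts]

meals = ["oatmeal with berries", "grilled chicken salad", "pasta, 2 cups"]
print(drop_meal_text(meals, p_drop=0.8, seed=0))
```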
Problem

Research questions and friction points this paper is trying to address.

Capturing hunger and satiety dynamics noninvasively in daily life settings
Integrating physiological PPG signals with meal descriptions for dietary monitoring
Improving caloric intake prediction accuracy using multimodal wearable data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates photoplethysmography with meal descriptions
Projects PPG into language model interpretable embeddings
Enables joint reasoning over physiology and meal context (a minimal input-assembly sketch follows this list)
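As a minimal sketch of the joint reasoning listed above, and assuming the projection module from the earlier sketch, projected PPG "tokens" can simply be concatenated with meal-text token embeddings into one sequence the language model consumes. The prompt layout and ordering here are illustrative assumptions, not the paper's exact interface.

```python
import torch

def build_multimodal_input(ppg_embeds, text_embeds):
    """Concatenate projected PPG embeddings with text-token embeddings.

    ppg_embeds:  (batch, n_ppg,  d_lm) from a PPG projector (sketch above)
    text_embeds: (batch, n_text, d_lm) from the LM's token-embedding table
    Returns one sequence the LM treats as if it were ordinary tokens.
    """
    return torch.cat([ppg_embeds, text_embeds], dim=1)

ppg = torch.randn(2, 118, 768)   # projected PPG frames
text = torch.randn(2, 12, 768)   # e.g. embedded "Lunch: grilled chicken..."
inputs = build_multimodal_input(ppg, text)
print(inputs.shape)              # torch.Size([2, 130, 768])
```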