Palpation Alters Auditory Pain Expressions with Gender-Specific Variations in Robopatients

📅 2025-06-13
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Current medical simulators struggle to generate multimodal pain feedback aligned with human perception, particularly exhibiting modeling bottlenecks in dynamically mapping palpation force to auditory pain expression. To address this, we propose the first adaptive-voicing robopatient system, integrating high-fidelity force sensing, gender-specific acoustic modeling, and a Proximal Policy Optimization (PPO)-based reinforcement learning framework. Crucially, we introduce real-time human preference feedback for dynamic calibration within continuous force space. Our study reveals, for the first time, a gender-specific force-threshold saturation effect in auditory pain perception: females exhibit heightened sensitivity to low-force palpation. The system comprehensively covers the full spectrum of pain intensity—from mild discomfort to acute distress—and achieves a 42% reduction in individualized adaptation error. This significantly enhances both the fidelity and personalization of abdominal palpation training.

Technology Category

Application Category

📝 Abstract
Diagnostic errors remain a major cause of preventable deaths, particularly in resource-limited regions. Medical training simulators, including robopatients, play a vital role in reducing these errors by mimicking real patients for procedural training such as palpation. However, generating multimodal feedback, especially auditory pain expressions, remains challenging due to the complex relationship between palpation behavior and sound. The high-dimensional nature of pain sounds makes exploration challenging with conventional methods. This study introduces a novel experimental paradigm for pain expressivity in robopatients where they dynamically generate auditory pain expressions in response to palpation force, by co-optimizing human feedback using machine learning. Using Proximal Policy Optimization (PPO), a reinforcement learning (RL) technique optimized for continuous adaptation, our robot iteratively refines pain sounds based on real-time human feedback. This robot initializes randomized pain responses to palpation forces, and the RL agent learns to adjust these sounds to align with human preferences. The results demonstrated that the system adapts to an individual's palpation forces and sound preferences and captures a broad spectrum of pain intensity, from mild discomfort to acute distress, through RL-guided exploration of the auditory pain space. The study further showed that pain sound perception exhibits saturation at lower forces with gender specific thresholds. These findings highlight the system's potential to enhance abdominal palpation training by offering a controllable and immersive simulation platform.
Problem

Research questions and friction points this paper is trying to address.

Develop robopatients generating realistic auditory pain expressions during palpation
Optimize pain sounds using machine learning and human feedback
Address gender-specific variations in pain perception thresholds
Innovation

Methods, ideas, or system contributions that make the work stand out.

Machine learning optimizes pain sound feedback
Reinforcement learning adapts to human preferences
Dynamic pain expression based on palpation force
🔎 Similar Papers
No similar papers found.
C
Chapa Sirithunge
Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, The United Kingdom
Y
Yue Xie
Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, The United Kingdom
S
Saitarun Nadipineni
School of Engineering and Materials Science in Queen Mary University of London, UK
Fumiya Iida
Fumiya Iida
University of Cambridge
Biologically Inspired Robotics Laboratory
T
T. Lalitharatne
School of Engineering and Materials Science in Queen Mary University of London, UK