🤖 AI Summary
Current medical simulators struggle to generate multimodal pain feedback aligned with human perception, particularly exhibiting modeling bottlenecks in dynamically mapping palpation force to auditory pain expression. To address this, we propose the first adaptive-voicing robopatient system, integrating high-fidelity force sensing, gender-specific acoustic modeling, and a Proximal Policy Optimization (PPO)-based reinforcement learning framework. Crucially, we introduce real-time human preference feedback for dynamic calibration within continuous force space. Our study reveals, for the first time, a gender-specific force-threshold saturation effect in auditory pain perception: females exhibit heightened sensitivity to low-force palpation. The system comprehensively covers the full spectrum of pain intensity—from mild discomfort to acute distress—and achieves a 42% reduction in individualized adaptation error. This significantly enhances both the fidelity and personalization of abdominal palpation training.
📝 Abstract
Diagnostic errors remain a major cause of preventable deaths, particularly in resource-limited regions. Medical training simulators, including robopatients, play a vital role in reducing these errors by mimicking real patients for procedural training such as palpation. However, generating multimodal feedback, especially auditory pain expressions, remains challenging due to the complex relationship between palpation behavior and sound. The high-dimensional nature of pain sounds makes exploration challenging with conventional methods. This study introduces a novel experimental paradigm for pain expressivity in robopatients where they dynamically generate auditory pain expressions in response to palpation force, by co-optimizing human feedback using machine learning. Using Proximal Policy Optimization (PPO), a reinforcement learning (RL) technique optimized for continuous adaptation, our robot iteratively refines pain sounds based on real-time human feedback. This robot initializes randomized pain responses to palpation forces, and the RL agent learns to adjust these sounds to align with human preferences. The results demonstrated that the system adapts to an individual's palpation forces and sound preferences and captures a broad spectrum of pain intensity, from mild discomfort to acute distress, through RL-guided exploration of the auditory pain space. The study further showed that pain sound perception exhibits saturation at lower forces with gender specific thresholds. These findings highlight the system's potential to enhance abdominal palpation training by offering a controllable and immersive simulation platform.