🤖 AI Summary
This study addresses the limited realism of robotic patients (RoboPatients) in clinical palpation training. To enhance perceptual fidelity, we propose a haptic–auditory cross-modal pain expression synthesis method. Through abdominal phantom-based sensing and psychophysical experiments, we first identify pitch and amplitude as the core acoustic dimensions governing haptic–auditory consistency in pain perception. Based on this finding, we develop an adaptive haptic–acoustic response mapping model and integrate it into a multimodal synchronized synthesis engine, enabling real-time, physiologically plausible vocal and facial pain expressions during palpation. Evaluated across 7,680 trials, our approach demonstrates that pitch predominantly governs perceptual consistency (p < 0.001), while amplitude exhibits a strong correlation with applied pressure (r = 0.92). These results significantly improve the realism of pain expression and the pedagogical effectiveness of clinical simulation training.
📝 Abstract
Misdiagnosis can lead to delayed treatment and patient harm. Robotic patients offer a controlled way to train and evaluate clinicians on rare, subtle, or complex cases, reducing diagnostic errors. We present RoboPatient, a medical robotic simulator aimed at multimodal pain synthesis based on haptic and auditory feedback during palpation-based training scenarios. The RoboPatient functions as an adaptive intermediary, capable of synthesizing plausible vocal and facial pain expressions in response to tactile stimuli generated during palpation. Using an abdominal phantom, the RoboPatient captures and processes haptic input via an internal palpation-to-pain mapping model. To evaluate perceptual congruence between palpation and the corresponding auditory output, we conducted a study of 7,680 trials across 20 participants, who rated pain intensity from sound alone. Results show that amplitude and pitch significantly influence agreement with the robot's pain expressions, irrespective of the specific pain sound used. Stronger palpation forces elicited stronger agreement, aligning with psychophysical patterns. The study revealed two key dimensions: pitch and amplitude are central to how people perceive pain sounds, with pitch being the most influential cue. These acoustic features shape how well the sound matches the applied force during palpation, impacting perceived realism. This approach lays the groundwork for high-fidelity robotic patients in clinical education and diagnostic simulation.
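To make the palpation-to-pain mapping concrete, the sketch below shows one plausible form such a model could take: palpation force is normalized, passed through a Stevens-style power law (a common psychophysical intensity model), and mapped to a vocal pitch and amplitude. All numeric parameters (force range, pitch range, exponent) and the function name are illustrative assumptions, not the mapping actually used by RoboPatient.

```python
# Hypothetical sketch of a palpation-to-pain acoustic mapping.
# All constants below are illustrative assumptions, not RoboPatient's values.

FORCE_MIN, FORCE_MAX = 0.0, 20.0     # assumed palpation force range (N)
PITCH_MIN, PITCH_MAX = 180.0, 400.0  # assumed vocal pitch range (Hz)
GAMMA = 0.7                          # assumed power-law exponent

def pain_acoustics(force_n: float) -> tuple[float, float]:
    """Map a palpation force (N) to a (pitch_hz, amplitude) pair."""
    # Normalize force to [0, 1], clipping out-of-range input
    x = min(max((force_n - FORCE_MIN) / (FORCE_MAX - FORCE_MIN), 0.0), 1.0)
    # Compressive power law mimics psychophysical intensity scaling
    level = x ** GAMMA
    pitch_hz = PITCH_MIN + level * (PITCH_MAX - PITCH_MIN)
    amplitude = level  # normalized gain in [0, 1]
    return pitch_hz, amplitude
```

Because the mapping is monotonic, stronger palpation always yields higher pitch and louder output, which is consistent with the reported finding that stronger forces elicit stronger perceptual agreement.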