🤖 AI Summary
To address language and motor developmental delays in children with hearing impairment, this study developed a virtual reality (VR) multisensory training system that integrates auditory, visual, and tactile feedback, centered on pitch-matching tasks designed to concurrently train speech production and motor coordination. Notably, it is the first system to integrate VR with functional near-infrared spectroscopy (fNIRS) neurofeedback, enabling real-time monitoring and closed-loop neuromodulation of prefrontal and motor cortical activation during rehabilitation. By quantifying neural responses (changes in oxyhemoglobin, HbO, concentration) alongside behavioral metrics (movement accuracy and reaction time), the intervention demonstrated significantly increased cortical activation, heightened cognitive engagement, and improved motor control in hearing-impaired children. These findings establish a paradigm for personalized, immersive neurorehabilitation grounded in objective neurobehavioral biomarkers.
📝 Abstract
Children with hearing impairments face ongoing challenges in language and motor development. This study explores how VR-based multisensory feedback, integrating auditory, visual, and tactile stimuli, can enhance rehabilitation outcomes. Using functional near-infrared spectroscopy (fNIRS), we assessed cortical activation patterns in children during pitch-matching tasks across different interaction modes. Our findings aim to inform the design of personalized, interactive rehabilitation systems that enhance cognitive engagement and motor control in children with hearing impairments.
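The fNIRS measure referenced above, change in oxyhemoglobin (HbO) concentration, is conventionally recovered from raw optical density changes at two wavelengths via the modified Beer–Lambert law. The sketch below illustrates that standard conversion only; the extinction coefficients, path length, and differential pathlength factor are illustrative placeholders, not values from this study.

```python
import numpy as np

# Modified Beer-Lambert law: delta_OD = (E @ delta_C) * L * DPF,
# where delta_C = [d(HbO), d(HbR)] in concentration units.
# Extinction-coefficient matrix (rows: ~760 nm, ~850 nm;
# cols: HbO, HbR) -- PLACEHOLDER values for illustration only.
E = np.array([[1.49, 3.84],
              [2.53, 1.80]])
L = 3.0    # source-detector separation in cm (assumed)
DPF = 6.0  # differential pathlength factor (assumed)

def hbo_hbr_from_od(delta_od):
    """Invert the two-wavelength system to get (dHbO, dHbR)."""
    return np.linalg.solve(E * L * DPF, np.asarray(delta_od))

# Example: hypothetical optical-density changes at the two wavelengths.
d_hbo, d_hbr = hbo_hbr_from_od([0.02, 0.05])
```

In practice, toolkits apply this channel-by-channel to continuous recordings before block-averaging the HbO time courses used as activation metrics.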