🤖 AI Summary
Cochlear implant users struggle to achieve spatial hearing in noisy environments, a deficit that severely impairs attentional control and speech comprehension. This study proposes the first interdisciplinary framework integrating artificial intelligence, multimodal human-computer interaction, and psychophysical methods to jointly optimize spatial-cue encoding strategies for cochlear implants. By combining an AI-driven spatial hearing reconstruction algorithm, perceptual training paradigms, and auditory neuroengineering approaches, the framework overcomes the limitations of traditional unimodal interventions. The work establishes an innovative technical pathway and provides an empirical foundation for enhancing the auditory performance of cochlear implant users in complex acoustic environments.
📝 Abstract
Cochlear implants (CIs) have been developed to the point where they can restore hearing and speech understanding in a large proportion of patients. Although spatial hearing is central to controlling and directing attention and to enabling speech understanding in noisy environments, it has so far been largely neglected. We propose here a multi-disciplinary research framework in which physicians, psychologists, and engineers collaborate to improve spatial hearing for CI users.