Beyond the Portal: Enhancing Recognition in Virtual Reality Through Multisensory Cues

πŸ“… 2025-09-14
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Visual-only virtual reality (VR) “portal” metaphors yield poor scene recognition under visually constrained conditions. Method: This study introduces a multisensory integration framework for the portal paradigm, systematically adding congruent auditory and olfactory cues while controlling visual input, and evaluates it in a controlled user study. Contribution/Results: Behavioral analysis shows that adding matched sound and odor significantly improves scene identification accuracy (+28.3%) and reduces response time (−31.7%) relative to vision-only conditions, demonstrating effective cross-modal compensation for visual deficits. The work contributes an immersive interaction paradigm that mitigates VR’s overreliance on the visual modality, thereby enhancing remote environmental awareness, and it offers both empirical evidence and methodological guidance for designing multimodal VR systems that support robust perception under sensory limitation.

πŸ“ Abstract
While Virtual Reality (VR) systems have become increasingly immersive, they still rely predominantly on visual input, which can constrain perceptual performance when visual information is limited. Incorporating additional sensory modalities, such as sound and scent, offers a promising strategy to enhance user experience and overcome these limitations. This paper investigates the contribution of auditory and olfactory cues in supporting perception within the portal metaphor, a VR technique that reveals remote environments through narrow, visually constrained transitions. We conducted a user study in which participants identified target scenes by selecting the correct portal among alternatives under varying sensory conditions. The results demonstrate that integrating visual, auditory, and olfactory cues significantly improved both recognition accuracy and response time. These findings highlight the potential of multisensory integration to compensate for visual constraints in VR and emphasize the value of incorporating sound and scent to enhance perception, immersion, and interaction within future VR system designs.
Problem

Research questions and friction points this paper is trying to address.

Enhancing VR recognition with multisensory cues
Overcoming visual limitations through auditory and olfactory integration
Improving portal metaphor perception via sensory augmentation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrating auditory cues to enhance VR recognition
Incorporating olfactory cues to improve perception accuracy
Combining multisensory inputs to overcome visual constraints
πŸ”Ž Similar Papers
No similar papers found.