🤖 AI Summary
Visual-only virtual reality (VR) "portal" metaphors suffer from inadequate scene recognition under visually constrained conditions. Method: This study introduces a multisensory integration framework within this paradigm, systematically incorporating congruent auditory and olfactory cues while rigorously controlling visual input, and conducts a controlled user study. Contribution/Results: Behavioral analysis reveals that adding matched sound and odor significantly improves scene identification accuracy (+28.3%) and reduces response time (−31.7%) compared to vision-only conditions, empirically demonstrating effective cross-modal compensation for visual deficits. The work introduces a novel immersive interaction paradigm that mitigates VR's overreliance on the visual modality, thereby enhancing remote environmental awareness. It provides both empirical evidence and methodological guidance for designing multimodal VR systems that robustly support perception under sensory limitation.
📄 Abstract
While Virtual Reality (VR) systems have become increasingly immersive, they still rely predominantly on visual input, which can constrain perceptual performance when visual information is limited. Incorporating additional sensory modalities, such as sound and scent, offers a promising strategy to enhance user experience and overcome these limitations. This paper investigates the contribution of auditory and olfactory cues in supporting perception within the portal metaphor, a VR technique that reveals remote environments through narrow, visually constrained transitions. We conducted a user study in which participants identified target scenes by selecting the correct portal among alternatives under varying sensory conditions. The results demonstrate that integrating visual, auditory, and olfactory cues significantly improved both recognition accuracy and response time. These findings highlight the potential of multisensory integration to compensate for visual constraints in VR and emphasize the value of incorporating sound and scent to enhance perception, immersion, and interaction within future VR system designs.