Modeling the Impact of Visual Stimuli on Redirection Noticeability with Gaze Behavior in Virtual Reality

📅 2025-02-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study addresses visual–proprioceptive conflict induced by motion redirection in virtual reality (VR), investigating how environmental visual stimuli modulate user gaze behavior to influence redirection detectability. We propose the first gaze-driven regression model that jointly encodes scene-level visual features and real-time eye-tracking signals, enabling cross-scene generalization. Furthermore, we design a perception-aware adaptive redirection framework that dynamically adjusts redirection parameters based on predicted detectability. A three-phase user study validates our approach: the model achieves a mean squared error (MSE) of only 0.012 when predicting redirection detectability in unseen scenes; subjective workload is significantly reduced; and users report enhanced sense of embodiment. These results demonstrate the effectiveness and practicality of our method in complex, realistic VR environments.

📝 Abstract
While users can embody virtual avatars that mirror their physical movements in Virtual Reality, these avatars' motions can be redirected to enable novel interactions. Excessive redirection, however, can break the user's sense of embodiment due to perceptual conflicts between vision and proprioception. While prior work focused on avatar-related factors influencing the noticeability of redirection, we investigate how visual stimuli in the surrounding virtual environment affect user behavior and, in turn, the noticeability of redirection. Given the wide variety of visual stimuli and their tendency to elicit varying individual reactions, we propose using users' gaze behavior as an indicator of their response to the stimuli and modeling the noticeability of redirection. We conducted two user studies to collect users' gaze behavior and noticeability ratings, investigating the relationship between them and identifying the gaze behavior features most predictive of noticeability. Based on the data, we developed a regression model that takes users' gaze behavior as input and outputs the noticeability of redirection. We then conducted an evaluation study to test our model on unseen visual stimuli, achieving an accuracy of 0.012 MSE. We further implemented an adaptive redirection technique and conducted a proof-of-concept study to evaluate its effectiveness with complex visual stimuli in two applications. The results indicated that participants experienced lower physical demand and a stronger sense of body ownership when using our adaptive technique, demonstrating the potential of our model to support real-world use cases.
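The abstract describes a regression model mapping gaze-behavior features to a redirection-noticeability score. A minimal sketch of that idea, assuming illustrative feature names (fixation duration, saccade rate, pupil-diameter change, a scene-level salience score) and made-up data — none of these values or feature choices come from the paper:

```python
import numpy as np

# Hypothetical gaze features per trial: fixation duration (s), saccade rate (Hz),
# pupil-diameter change, and a scene-level visual-salience score.
# Feature names and values are illustrative, not taken from the paper.
X = np.array([
    [0.8, 2.1, 0.05, 0.3],
    [0.4, 3.5, 0.12, 0.7],
    [1.1, 1.8, 0.02, 0.2],
    [0.5, 3.0, 0.10, 0.6],
    [0.9, 2.4, 0.04, 0.4],
    [0.3, 3.8, 0.15, 0.8],
])
# Target: how noticeable the redirection was on each trial (0..1).
y = np.array([0.2, 0.7, 0.1, 0.6, 0.3, 0.8])

# Fit ordinary least squares with an intercept column appended.
A = np.hstack([X, np.ones((X.shape[0], 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_noticeability(features):
    """Predict redirection noticeability from one gaze-feature vector."""
    return float(np.append(np.asarray(features, dtype=float), 1.0) @ w)

# Training error in the same metric the paper reports (MSE).
mse = float(np.mean((A @ w - y) ** 2))
```

The paper's actual model likely uses richer features and a learned regressor; the point here is only the input/output shape: per-trial gaze features in, a scalar noticeability prediction out, evaluated by MSE.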
Problem

Research questions and friction points this paper is trying to address.

Analyzes visual stimuli's effect on VR redirection noticeability
Uses gaze behavior to model redirection perception
Develops adaptive redirection for enhanced user embodiment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaze behavior predicts redirection noticeability
Regression model analyzes visual stimuli impact
Adaptive technique enhances virtual embodiment experience
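The adaptive technique is described as dynamically adjusting redirection parameters based on predicted noticeability. A minimal sketch of one plausible control loop, assuming a single hand-redirection gain and hypothetical threshold/step values not specified in the paper:

```python
def adapt_redirection_gain(pred_noticeability, gain, threshold=0.5,
                           step=0.05, min_gain=1.0, max_gain=1.5):
    """Per-frame update: back off the redirection gain when the model
    predicts the user is likely to notice; otherwise nudge it toward the
    application's target. All parameter names and values are illustrative."""
    if pred_noticeability > threshold:
        # Redirection is at risk of being detected: reduce it.
        return max(min_gain, gain - step)
    # Headroom remains: increase redirection toward the allowed maximum.
    return min(max_gain, gain + step)
```

In use, the regression model's prediction would be fed into this update each frame, so the redirection stays just below each user's detection threshold in the current scene.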