EyeNexus: Adaptive Gaze-Driven Quality and Bitrate Streaming for Seamless VR Cloud Gaming Experiences

📅 2025-09-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
VR cloud gaming suffers from visual quality fluctuations and cybersickness under dynamic network conditions, primarily due to misalignment between foveated rendering and encoding, as well as lagging bitrate adaptation for high-resolution content delivery. This paper proposes the first unified framework integrating real-time eye-tracking alignment modeling with dynamic foveal region adjustment, jointly optimizing foveated spatial compression (FSC) and foveated video encoding (FVE) to enable bandwidth-driven quality allocation and smooth quality transitions. The method incorporates variable-resolution foveated rendering, eye-movement-guided encoding, real-time bandwidth adaptation, and low-latency transmission. Experimental results demonstrate a 70.9% reduction in end-to-end latency and a 24.6% improvement in perceptual video quality. User studies confirm a 48% enhancement in playability and visual quality, while completely eliminating cybersickness.
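The paper does not reproduce the foveation model's equations here, but the described behavior (a foveal region that grows or shrinks with available bandwidth, with a smooth quality gradient toward the periphery) can be sketched as follows. All function names, radii, and QP constants below are illustrative assumptions, not the authors' actual parameters:

```python
def foveal_radius_deg(bandwidth_mbps, r_min=5.0, r_max=20.0, bw_max=100.0):
    """Hypothetical mapping: more available bandwidth -> a larger
    high-quality foveal region (angular radius in degrees)."""
    frac = max(0.0, min(bandwidth_mbps / bw_max, 1.0))
    return r_min + frac * (r_max - r_min)

def assign_qp(ecc_deg, radius_deg, qp_fovea=22, qp_periphery=40, blend_deg=10.0):
    """Assign an HEVC-style quantization parameter by eccentricity
    (angular distance of a block from the gaze point).
    Inside the fovea: low QP (high quality); beyond a blend band:
    high QP (low quality); in between: a linear ramp, so the quality
    transition stays smooth rather than a visible hard edge."""
    if ecc_deg <= radius_deg:
        return qp_fovea
    if ecc_deg >= radius_deg + blend_deg:
        return qp_periphery
    t = (ecc_deg - radius_deg) / blend_deg
    return round(qp_fovea + t * (qp_periphery - qp_fovea))

# Example: at 50 Mbps the foveal radius sits mid-range, and quality
# degrades gradually with distance from the gaze point.
r = foveal_radius_deg(50.0)          # 12.5 degrees
qp_center = assign_qp(0.0, r)        # 22 (sharp)
qp_edge = assign_qp(r + 5.0, r)      # 31 (mid-ramp)
qp_far = assign_qp(40.0, r)          # 40 (coarse)
```

The linear ramp stands in for whatever perceptual falloff the actual model uses; the point is that quality allocation becomes a single bandwidth-indexed function of gaze eccentricity, which is what makes network-aware quality assignment in FVE simple.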

📝 Abstract
Virtual Reality (VR) cloud gaming systems render the 3D graphics on cloud servers for playing graphically demanding games on VR headsets. Delivering high-resolution game scenes is challenging due to variation in network performance. By leveraging the non-uniform human vision perception, foveated rendering and encoding have proven effective for optimized streaming in constrained networks. SoTA foveation methods either do not incorporate real-time gaze data or are unable to handle variations in network conditions, resulting in a suboptimal user experience. We introduce EyeNexus, a pioneering system that combines real-time gaze-driven spatial compression (FSC) with gaze-driven video encoding (FVE), transforming the gaze point for precise alignment and foveation. We propose a novel foveation model that dynamically adjusts the foveation region based on real-time bandwidth and gaze data. The model simplifies network-aware quality assignment in FVE, ensuring smooth and imperceptible quality gradients. We evaluate EyeNexus using objective and subjective measures with different network conditions and games. EyeNexus reduces latency by up to 70.9% and improves perceptual visual quality by up to 24.6%. Our IRB-approved user study shows that EyeNexus achieves the highest playability and visual quality, with improvements of up to 48%, while eliminating motion sickness.
Problem

Research questions and friction points this paper is trying to address.

How to adapt gaze-driven streaming quality and bitrate for VR cloud gaming under fluctuating networks
How to adjust the foveation region dynamically from real-time bandwidth and gaze data
How to reduce end-to-end latency while improving perceived visual quality in VR
Innovation

Methods, ideas, or system contributions that make the work stand out.

Real-time gaze-driven spatial compression aligned with foveated video encoding
Dynamic foveation model that adjusts the foveal region based on real-time bandwidth and gaze
Network-aware quality assignment with smooth, imperceptible quality gradients