A3FR: Agile 3D Gaussian Splatting with Incremental Gaze Tracked Foveated Rendering in Virtual Reality

📅 2025-07-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Real-time rendering of 3D Gaussian Splatting (3DGS) in VR suffers from high latency, and conventional gaze-tracked foveated rendering can make matters worse because the eye-tracking step adds its own overhead. Method: This paper proposes an efficient parallel framework that tightly couples incremental gaze prediction with dynamic-resolution foveated rendering, executing both as an end-to-end pipeline on the GPU to eliminate serial dependencies and their associated latency. Built on 3DGS, the framework adaptively modulates rendering resolution and Gaussian sampling density according to distance from the foveal region, preserving perceptual fidelity where the eye is most sensitive. Contribution/Results: Experiments demonstrate up to a 2× reduction in end-to-end rendering latency while maintaining stable subjective quality across diverse VR scenes, delivering a scalable, system-level solution for high-fidelity, low-latency VR rendering.
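The pipelining idea described above can be sketched as follows: while frame i is rendered using an already-available gaze estimate, the gaze prediction for frame i+1 runs concurrently, so neither stage waits on the other. This is an illustrative sketch only, not the paper's implementation; `predict_gaze` and `render_foveated` are hypothetical stand-ins for the incremental gaze tracker and the foveated 3DGS renderer.

```python
import threading
import queue

def predict_gaze(frame_id):
    # Hypothetical stand-in for incremental gaze prediction.
    # Returns a normalized (x, y) gaze point on the display.
    return (0.5, 0.5)

def render_foveated(frame_id, gaze):
    # Hypothetical stand-in for foveated 3DGS rendering of one frame.
    return f"frame{frame_id}@{gaze}"

def pipelined_render(num_frames):
    """Overlap gaze prediction for frame i+1 with rendering of frame i,
    removing the serial tracker-then-renderer dependency."""
    results = []
    gaze_q = queue.Queue(maxsize=1)
    gaze_q.put(predict_gaze(0))  # prime the pipeline with an initial gaze
    for i in range(num_frames):
        gaze = gaze_q.get()
        # Launch the next frame's gaze prediction in parallel with rendering.
        t = threading.Thread(target=lambda j=i + 1: gaze_q.put(predict_gaze(j)))
        t.start()
        results.append(render_foveated(i, gaze))
        t.join()
    return results
```

In a real system the two stages would run on separate GPU streams rather than CPU threads, but the structure (prime, then overlap predict and render each iteration) is the same.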

📝 Abstract
Virtual reality (VR) significantly transforms immersive digital interfaces, greatly enhancing education, professional practices, and entertainment by increasing user engagement and opening up new possibilities in various industries. Among its numerous applications, image rendering is crucial. Nevertheless, rendering methodologies like 3D Gaussian Splatting impose high computational demands, driven predominantly by user expectations for superior visual quality. This results in notable processing delays for real-time image rendering, which greatly affects the user experience. Additionally, VR devices such as head-mounted displays (HMDs) are intricately linked to human visual behavior, leveraging knowledge from perception and cognition to improve user experience. These insights have spurred the development of foveated rendering, a technique that dynamically adjusts rendering resolution based on the user's gaze direction. The resultant solution, known as gaze-tracked foveated rendering, significantly reduces the computational burden of the rendering process. Although gaze-tracked foveated rendering can reduce rendering costs, the computational overhead of the gaze tracking process itself can sometimes outweigh the rendering savings, leading to increased processing latency. To address this issue, we propose an efficient rendering framework called *A3FR*, designed to minimize the latency of gaze-tracked foveated rendering via the parallelization of gaze tracking and foveated rendering processes. For the rendering algorithm, we utilize 3D Gaussian Splatting, a state-of-the-art neural rendering technique. Evaluation results demonstrate that A3FR can reduce end-to-end rendering latency by up to 2× while maintaining visual quality.
Problem

Research questions and friction points this paper is trying to address.

Reducing computational demands in VR 3D Gaussian Splatting
Minimizing latency in gaze-tracked foveated rendering systems
Balancing visual quality and real-time rendering performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parallelizes gaze tracking and foveated rendering to remove their serial dependency
Uses 3D Gaussian Splatting, a state-of-the-art neural rendering technique
Reduces end-to-end rendering latency by up to 2× while maintaining visual quality
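The foveation side of the method rests on a standard idea: rendering cost falls with angular distance from the gaze point, since peripheral vision tolerates lower resolution. A minimal sketch of that eccentricity-to-quality mapping, with illustrative thresholds rather than the paper's actual parameters:

```python
import math

def foveation_level(pixel, gaze, fovea_radius=0.1, periphery_radius=0.4):
    """Map a pixel's distance from the gaze point to a rendering level:
    0 = full resolution and full Gaussian density (fovea),
    1 = intermediate resolution (near periphery),
    2 = coarsest shading (far periphery).
    Coordinates and radii are in normalized screen units; the radii
    here are illustrative defaults, not values from the paper."""
    d = math.dist(pixel, gaze)
    if d <= fovea_radius:
        return 0
    if d <= periphery_radius:
        return 1
    return 2
```

A renderer would use this level to pick the resolution tier and Gaussian sampling density for each screen region, concentrating work where the user is actually looking.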
Shuo Xin
Tandon School of Engineering, New York University, New York, USA
Haiyu Wang
Tandon School of Engineering, New York University, New York, USA
Sai Qian Zhang
New York University