EyeNavGS: A 6-DoF Navigation Dataset and Record-n-Replay Software for Real-World 3DGS Scenes in VR

📅 2025-06-03
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
High-fidelity six-degree-of-freedom (6-DoF) VR navigation data is unavailable for real-world 3D Gaussian Splatting (3DGS) scenes, hindering research in viewport prediction, adaptive streaming, and foveated rendering. To address this, we introduce the first 6-DoF navigation dataset specifically designed for real-scene 3DGS reconstructions, comprising 12 high-quality scenes and synchronized head and eye movement trajectories from 46 VR users. We propose a scene initialization calibration method to ensure VR comfort and perceptual consistency. Furthermore, we open-source the SIBR Viewer toolchain, a 3DGS-optimized platform supporting recording and playback for visualization and analysis. This dataset and toolkit collectively advance research in 3DGS-driven VR interaction modeling, rendering optimization, and perception-aware algorithm evaluation, establishing a foundational resource for immersive 3D content delivery.


๐Ÿ“ Abstract
3D Gaussian Splatting (3DGS) is an emerging media representation that reconstructs real-world 3D scenes in high fidelity, enabling 6-degrees-of-freedom (6-DoF) navigation in virtual reality (VR). However, developing and evaluating 3DGS-enabled applications and optimizing their rendering performance, require realistic user navigation data. Such data is currently unavailable for photorealistic 3DGS reconstructions of real-world scenes. This paper introduces EyeNavGS (EyeNavGS), the first publicly available 6-DoF navigation dataset featuring traces from 46 participants exploring twelve diverse, real-world 3DGS scenes. The dataset was collected at two sites, using the Meta Quest Pro headsets, recording the head pose and eye gaze data for each rendered frame during free world standing 6-DoF navigation. For each of the twelve scenes, we performed careful scene initialization to correct for scene tilt and scale, ensuring a perceptually-comfortable VR experience. We also release our open-source SIBR viewer software fork with record-and-replay functionalities and a suite of utility tools for data processing, conversion, and visualization. The EyeNavGS dataset and its accompanying software tools provide valuable resources for advancing research in 6-DoF viewport prediction, adaptive streaming, 3D saliency, and foveated rendering for 3DGS scenes. The EyeNavGS dataset is available at: https://symmru.github.io/EyeNavGS/.
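Since the dataset records head pose and eye gaze per rendered frame, a typical first analysis step is computing per-frame motion statistics from a trace. The sketch below is a minimal, hypothetical example: the CSV layout and column names (`timestamp`, `head_x`, `head_y`, `head_z`) are assumptions for illustration, not the dataset's actual schema; consult the released utility tools for the real format.

```python
# Minimal sketch: derive per-frame head translation speed from a 6-DoF
# trace. Column names and CSV layout are ASSUMED for illustration only.
import csv
import math

def head_speeds(path):
    """Yield per-frame head translation speed (scene units per second).

    Assumed columns: timestamp (seconds) and head position head_x/y/z;
    orientation and gaze fields, if present, are ignored here.
    """
    prev = None
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = float(row["timestamp"])
            p = tuple(float(row[k]) for k in ("head_x", "head_y", "head_z"))
            if prev is not None:
                t0, p0 = prev
                dt = t - t0
                if dt > 0:
                    # Euclidean displacement over the frame interval.
                    yield math.dist(p, p0) / dt
            prev = (t, p)
```

The same pattern extends naturally to angular head velocity or gaze dynamics once the corresponding columns are known.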
Problem

Research questions and friction points this paper is trying to address.

Lack of realistic 6-DoF navigation data for 3DGS scenes
Need for tools to evaluate 3DGS rendering performance in VR
Absence of public datasets for 3DGS-based VR research
Innovation

Methods, ideas, or system contributions that make the work stand out.

6-DoF navigation dataset for 3DGS scenes
Record-and-replay VR software with gaze tracking
Open-source tools for 3DGS data processing
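The record-and-replay idea above amounts to resampling a recorded pose stream to the playback frame rate. The sketch below shows one hedged way to do this for head positions via linear interpolation; the `(time, position)` sample structure is an illustrative assumption, and a real replayer would also interpolate orientations (e.g. quaternion slerp) and gaze.

```python
# Hedged sketch: resample recorded head positions to a fixed output
# frame rate by linear interpolation. The sample structure is an
# ASSUMPTION for illustration, not the dataset's actual schema.
from bisect import bisect_right

def resample_positions(samples, fps):
    """samples: list of (t, (x, y, z)) sorted by time t in seconds.
    Returns interpolated positions at 1/fps intervals over the trace."""
    times = [t for t, _ in samples]
    out = []
    step = 1.0 / fps
    t = times[0]
    while t <= times[-1]:
        i = bisect_right(times, t)
        if i == len(times):
            out.append(samples[-1][1])          # at or past the last sample
        else:
            (t0, p0), (t1, p1) = samples[i - 1], samples[i]
            a = (t - t0) / (t1 - t0)            # interpolation weight in [0, 1]
            out.append(tuple(p0[k] + a * (p1[k] - p0[k]) for k in range(3)))
        t += step
    return out
```

Replaying on a fixed clock like this lets rendering experiments (e.g. foveated rendering or streaming policies) be evaluated repeatably against identical navigation input.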