AI Summary
Real-world 3D Gaussian Splatting (3DGS) scenes lack high-fidelity six-degree-of-freedom (6-DoF) VR navigation data, hindering research in viewport prediction, adaptive streaming, and foveated rendering. To address this, we introduce the first 6-DoF navigation dataset specifically designed for real-scene 3DGS reconstructions, comprising 12 high-quality scenes and synchronized head and eye movement trajectories from 46 VR users. We propose a scene initialization calibration method to ensure VR comfort and perceptual consistency. Furthermore, we open-source the SIBR Viewer toolchain, a 3DGS-optimized platform supporting recording and playback for visualization and analysis. This dataset and toolkit collectively advance research in 3DGS-driven VR interaction modeling, rendering optimization, and perception-aware algorithm evaluation, establishing a foundational resource for immersive 3D content delivery.
Abstract
3D Gaussian Splatting (3DGS) is an emerging media representation that reconstructs real-world 3D scenes in high fidelity, enabling six-degrees-of-freedom (6-DoF) navigation in virtual reality (VR). However, developing and evaluating 3DGS-enabled applications and optimizing their rendering performance require realistic user navigation data. Such data is currently unavailable for photorealistic 3DGS reconstructions of real-world scenes. This paper introduces EyeNavGS, the first publicly available 6-DoF navigation dataset, featuring traces from 46 participants exploring twelve diverse, real-world 3DGS scenes. The dataset was collected at two sites using Meta Quest Pro headsets, recording the head pose and eye gaze data for each rendered frame during free-world standing 6-DoF navigation. For each of the twelve scenes, we performed careful scene initialization to correct for scene tilt and scale, ensuring a perceptually comfortable VR experience. We also release our open-source SIBR viewer software fork with record-and-replay functionality, along with a suite of utility tools for data processing, conversion, and visualization. The EyeNavGS dataset and its accompanying software tools provide valuable resources for advancing research in 6-DoF viewport prediction, adaptive streaming, 3D saliency, and foveated rendering for 3DGS scenes. The EyeNavGS dataset is available at: https://symmru.github.io/EyeNavGS/.
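To illustrate how per-frame head-pose and eye-gaze traces like these might be consumed downstream (e.g., as a preprocessing step for viewport prediction), the sketch below parses a hypothetical CSV trace layout and computes average head speed. The column names, units, and file layout here are assumptions for illustration only, not the dataset's actual schema:

```python
import csv
import io
import math

# Hypothetical per-frame trace: timestamp (s), head position (m),
# head orientation quaternion, and a normalized gaze direction.
# The real EyeNavGS schema may differ.
SAMPLE = """\
t,px,py,pz,qw,qx,qy,qz,gx,gy,gz
0.000,0.000,1.600,0.0,1.0,0.0,0.0,0.0,0.0,0.0,-1.0
0.011,0.001,1.600,0.0,1.0,0.0,0.0,0.0,0.0,0.0,-1.0
"""

def load_trace(text):
    """Parse CSV rows into dicts of floats keyed by column name."""
    reader = csv.DictReader(io.StringIO(text))
    return [{k: float(v) for k, v in row.items()} for row in reader]

def head_speed(frames):
    """Average translational head speed (m/s) over consecutive frames."""
    dist, time = 0.0, 0.0
    for a, b in zip(frames, frames[1:]):
        dist += math.dist((a["px"], a["py"], a["pz"]),
                          (b["px"], b["py"], b["pz"]))
        time += b["t"] - a["t"]
    return dist / time if time > 0 else 0.0

frames = load_trace(SAMPLE)
print(len(frames), round(head_speed(frames), 2))
```

A real pipeline would read one such file per participant and scene, and could derive further statistics (angular head velocity from the quaternions, gaze dispersion from the gaze vectors) in the same fashion.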