RefracGS: Novel View Synthesis Through Refractive Water Surfaces with 3D Gaussian Ray Tracing

📅 2026-03-23
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenging problem of novel view synthesis beneath non-planar refractive water surfaces, where spatially varying optical distortions invalidate the straight-ray assumption of conventional methods, leading to severe artifacts. To overcome this limitation, we propose an end-to-end joint optimization framework that models the dynamic water surface via a neural height field and represents the underwater scene using a 3D Gaussian field. Crucially, we decouple the 3D Gaussian splatting from the explicit refractive surface and introduce, for the first time, a differentiable, refraction-aware Gaussian ray tracing formulation grounded in Snell’s law, enabling efficient rendering along nonlinear light paths. Our method significantly outperforms existing approaches on both synthetic and real-world scenes with complex water surfaces, achieves 15× faster training, and enables real-time, high-fidelity, view-consistent novel view synthesis at 200 FPS.
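The refraction-aware ray tracing described above bends each camera ray at the water interface according to Snell's law. A minimal sketch of that bending step in plain NumPy (this is not the paper's code; the function name `refract` and the ratio `eta` are illustrative):

```python
import numpy as np

def refract(d, n, eta):
    """Refract a unit ray direction d at a surface with unit normal n.

    eta is the ratio of refractive indices n1/n2 (air-to-water is
    roughly 1.0/1.33). Returns the unit transmitted direction, or
    None on total internal reflection.
    """
    cos_i = -np.dot(d, n)
    if cos_i < 0.0:            # ray hits the back face: flip the normal
        n, cos_i = -n, -cos_i
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        return None            # total internal reflection
    # Transmitted direction per the vector form of Snell's law
    return eta * d + (eta * cos_i - np.sqrt(k)) * n
```

For a ray entering flat water straight down (`d = [0, 0, -1]`, `n = [0, 0, 1]`) the direction is unchanged; at oblique incidence the tangential component shrinks by `eta`, which is exactly `sin(theta_t) = eta * sin(theta_i)`.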

πŸ“ Abstract
Novel view synthesis (NVS) through non-planar refractive surfaces presents fundamental challenges due to severe, spatially varying optical distortions. While recent representations like NeRF and 3D Gaussian Splatting (3DGS) excel at NVS, their assumption of straight-line ray propagation fails under these conditions, leading to significant artifacts. To overcome this limitation, we introduce RefracGS, a framework that jointly reconstructs the refractive water surface and the scene beneath the interface. Our key insight is to explicitly decouple the refractive boundary from the target objects: the refractive surface is modeled via a neural height field, capturing wave geometry, while the underlying scene is represented as a 3D Gaussian field. We formulate a refraction-aware Gaussian ray tracing approach that accurately computes non-linear ray trajectories using Snell's law and efficiently renders the underlying Gaussian field while backpropagating the loss gradients to the parameterized refractive surface. Through end-to-end joint optimization of both representations, our method ensures high-fidelity NVS and view-consistent surface recovery. Experiments on both synthetic and real-world scenes with complex waves demonstrate that RefracGS outperforms prior refractive methods in visual quality, while achieving 15x faster training and real-time rendering at 200 FPS. The project page for RefracGS is available at https://yimgshao.github.io/refracgs/.
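The abstract models the refractive surface as a height field z = h(x, y), whose normal at each ray intersection drives the refraction. A minimal sketch of recovering that normal by finite differences (the paper uses a neural height field; here `h` is any callable standing in for it):

```python
import numpy as np

def surface_normal(h, x, y, eps=1e-3):
    """Unit upward normal of the height field z = h(x, y).

    Uses central differences; analytically n is proportional to
    (-dh/dx, -dh/dy, 1). A differentiable height field would give
    these gradients by autodiff instead.
    """
    dhdx = (h(x + eps, y) - h(x - eps, y)) / (2.0 * eps)
    dhdy = (h(x, y + eps) - h(x, y - eps)) / (2.0 * eps)
    n = np.array([-dhdx, -dhdy, 1.0])
    return n / np.linalg.norm(n)
```

A flat surface yields the straight-up normal `[0, 0, 1]`; a wave such as `h(x, y) = 0.1 * sin(x)` tilts the normal against the local slope, which is what produces the spatially varying distortions the paper targets.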
Problem

Research questions and friction points this paper is trying to address.

novel view synthesis
refractive surfaces
optical distortion
non-planar refraction
3D scene reconstruction
Innovation

Methods, ideas, or system contributions that make the work stand out.

refractive surface
3D Gaussian Splatting
novel view synthesis
ray tracing
Snell's law