R3GW: Relightable 3D Gaussians for Outdoor Scenes in the Wild

📅 2026-03-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing 3D Gaussian Splatting methods struggle to reconstruct outdoor scenes under uncontrolled lighting and lack relightability. This work proposes a relightable 3D Gaussian representation that decomposes outdoor scenes into a relightable foreground and a non-reflective background (the sky). By integrating physics-based rendering with a foreground-background Gaussian separation strategy, the method effectively mitigates depth-related boundary artifacts. The authors present this as the first approach to enable relightable 3D Gaussian modeling from uncontrolled outdoor image collections. It achieves state-of-the-art performance on the NeRF-OSR dataset, supports physically plausible novel-view synthesis under arbitrary illumination, and significantly improves visual quality in regions where the foreground meets the sky.

📝 Abstract
3D Gaussian Splatting (3DGS) has established itself as a leading technique for 3D reconstruction and novel view synthesis of static scenes, achieving outstanding rendering quality and fast training. However, the method does not explicitly model the scene illumination, making it unsuitable for relighting tasks. Furthermore, 3DGS struggles to reconstruct scenes captured in the wild by unconstrained photo collections featuring changing lighting conditions. In this paper, we present R3GW, a novel method that learns a relightable 3DGS representation of an outdoor scene captured in the wild. Our approach separates the scene into a relightable foreground and a non-reflective background (the sky), using two distinct sets of Gaussians. R3GW models view-dependent lighting effects in the foreground reflections by combining Physically Based Rendering with the 3DGS scene representation in a varying illumination setting. We evaluate our method quantitatively and qualitatively on the NeRF-OSR dataset, offering state-of-the-art performance and enhanced support for physically-based relighting of unconstrained scenes. Our method synthesizes photorealistic novel views under arbitrary illumination conditions. Additionally, our representation of the sky mitigates depth reconstruction artifacts, improving rendering quality at the sky-foreground boundary.
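The foreground-sky separation described in the abstract can be pictured as ordinary front-to-back alpha compositing in which any transmittance left over after blending the foreground Gaussians is filled by a non-reflective sky color. The sketch below is purely illustrative, not the paper's implementation; the function name, colors, and alphas are hypothetical.

```python
import numpy as np

def composite_pixel(fg_colors, fg_alphas, sky_color):
    """Illustrative sketch (not R3GW's actual renderer): front-to-back
    alpha compositing of foreground Gaussian contributions for one pixel,
    with the remaining transmittance filled by a separate sky color."""
    color = np.zeros(3)
    transmittance = 1.0
    for c, a in zip(fg_colors, fg_alphas):
        # Each foreground splat contributes its color weighted by its
        # opacity and the light still passing through earlier splats.
        color += transmittance * a * np.asarray(c, dtype=float)
        transmittance *= (1.0 - a)
    # Light not absorbed by the foreground comes from the sky background.
    color += transmittance * np.asarray(sky_color, dtype=float)
    return color

# Hypothetical example: two semi-transparent foreground splats over a blue sky.
pixel = composite_pixel(
    fg_colors=[(0.8, 0.2, 0.2), (0.1, 0.9, 0.1)],
    fg_alphas=[0.5, 0.5],
    sky_color=(0.3, 0.5, 1.0),
)
```

Keeping the sky in its own set of Gaussians (or, as here, a separate background term) means no foreground Gaussian has to stretch to infinity to explain sky pixels, which is one intuition for why the boundary artifacts the abstract mentions are reduced.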
Problem

Research questions and friction points this paper is trying to address.

relighting
3D Gaussian Splatting
outdoor scenes
unconstrained photo collections
illumination modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Relightable 3D Gaussian Splatting
Physically Based Rendering
Outdoor Scene Reconstruction
Unconstrained Photo Collections
View-dependent Lighting