Radiometrically Consistent Gaussian Surfels for Inverse Rendering

📅 2026-03-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing inverse rendering methods based on Gaussian splatting struggle to accurately disentangle material properties from complex global illumination—particularly indirect lighting—due to supervision being limited to observed viewpoints. This work proposes a radiance consistency constraint that, for the first time, integrates physics-based rendering into the Gaussian splatting framework. By minimizing the residual between learned radiance and physically rendered results, the method provides self-correcting supervision for unobserved views. Leveraging a Gaussian surfel representation combined with 2D Gaussian ray tracing, we construct an efficient inverse rendering system that enables rapid relighting fine-tuning. Our approach outperforms existing Gaussian-based methods across multiple benchmarks, achieving relighting adaptation in just a few minutes and rendering at under 10 milliseconds per frame, thus balancing accuracy and efficiency.

📝 Abstract
Inverse rendering with Gaussian Splatting has advanced rapidly, but accurately disentangling material properties from complex global illumination effects, particularly indirect illumination, remains a major challenge. Existing methods often query indirect radiance from Gaussian primitives pre-trained for novel-view synthesis. However, these pre-trained Gaussian primitives are supervised only on the limited training viewpoints, and thus lack supervision for modeling indirect radiance from unobserved views. To address this issue, we introduce radiometric consistency, a novel physically-based constraint that provides supervision for unobserved views by minimizing the residual between each Gaussian primitive's learned radiance and its physically-based rendered counterpart. Minimizing this residual for unobserved views establishes a self-correcting feedback loop that combines supervision from physically-based rendering and novel-view synthesis, enabling accurate modeling of inter-reflection. We then propose Radiometrically Consistent Gaussian Surfels (RadioGS), an inverse rendering framework built upon this principle that efficiently integrates radiometric consistency via Gaussian surfels and 2D Gaussian ray tracing. We further propose a finetuning-based relighting strategy that adapts Gaussian surfel radiances to new illuminations within minutes, achieving low rendering cost (<10ms). Extensive experiments on existing inverse rendering benchmarks show that RadioGS outperforms existing Gaussian-based methods in inverse rendering while retaining computational efficiency.
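The radiometric consistency constraint described in the abstract can be read as a per-primitive residual loss between learned radiance and a physically-based render of that radiance. Below is a minimal sketch of that idea, assuming a simplified Lambertian BRDF and a weighted set of incident-radiance samples; all function and parameter names are hypothetical illustrations, not the authors' implementation:

```python
import numpy as np

def pbr_radiance(albedo, incident_radiance, cos_theta, weights):
    """Estimate outgoing radiance under a Lambertian BRDF.

    albedo:            (3,) RGB diffuse albedo of one primitive
    incident_radiance: (n, 3) sampled incoming radiance values
    cos_theta:         (n,) cosine foreshortening per sample
    weights:           (n,) quadrature weights of the samples
    """
    # L_o = (albedo / pi) * sum_i L_i * cos_i * w_i  (Monte-Carlo-style sum)
    gathered = np.sum(incident_radiance * cos_theta[:, None] * weights[:, None], axis=0)
    return albedo / np.pi * gathered

def radiometric_consistency_loss(learned_radiance, albedo,
                                 incident_radiance, cos_theta, weights):
    """Residual between a primitive's learned radiance and its
    physically-based rendered counterpart (mean squared error over RGB)."""
    rendered = pbr_radiance(albedo, incident_radiance, cos_theta, weights)
    return float(np.mean((learned_radiance - rendered) ** 2))
```

Minimizing this residual jointly over the learned radiances and the material parameters is what gives the self-correcting supervision for views never seen during training: the learned radiance field and the physically-based render must agree everywhere, not only at observed viewpoints.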
Problem

Research questions and friction points this paper is trying to address.

inverse rendering
global illumination
indirect illumination
radiometric consistency
Gaussian splatting
Innovation

Methods, ideas, or system contributions that make the work stand out.

radiometric consistency
Gaussian surfels
inverse rendering
indirect illumination
physically-based rendering
Kyu Beom Han
School of Computing, Korea Advanced Institute of Science and Technology, Daejeon, South Korea
Jaeyoon Kim
KAIST
Woo Jae Kim
School of Computing, Korea Advanced Institute of Science and Technology, Daejeon, South Korea
Jinhwan Seo
KAIST
Sung-eui Yoon
School of Computing, Korea Advanced Institute of Science and Technology, Daejeon, South Korea