PBR-NeRF: Inverse Rendering with Physics-Based Neural Fields

πŸ“… 2024-12-12
πŸ›οΈ arXiv.org
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
To address the ill-posedness of NeRF-based inverse rendering, which stems from the entanglement of material and illumination, this paper introduces a physically based rendering (PBR)-guided NeRF framework. The method jointly optimizes geometry, materials (BRDF), and environment lighting in a differentiable manner. Its key innovation is a pair of physics-driven, differentiable loss terms that explicitly regularize BRDF parameters and environmental illumination, enabling effective decoupling of material and lighting estimation. The method further adopts multi-scale parameterization and differentiable rendering to balance novel-view synthesis quality against material fidelity. Extensive evaluations on multiple benchmarks show significant improvements in material estimation accuracy over state-of-the-art methods, while novel-view synthesis PSNR matches that of top-performing NeRF models. The implementation is open-sourced and designed for plug-and-play integration.
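The summary does not spell out the loss terms, but one common physics-based BRDF prior is energy conservation: a surface cannot reflect more light than it receives, so the diffuse and specular reflectance should sum to at most 1. A minimal sketch of such a penalty, assuming per-point albedo arrays (the function name and exact formulation are illustrative, not the paper's loss):

```python
import numpy as np

def energy_conservation_loss(diffuse_albedo, specular_albedo):
    """Illustrative physics-based prior: penalize materials whose
    diffuse + specular reflectance exceeds 1, i.e. materials that
    would reflect more energy than they receive."""
    total = diffuse_albedo + specular_albedo       # per-point total reflectance
    excess = np.maximum(total - 1.0, 0.0)          # violation of k_d + k_s <= 1
    return float(np.mean(excess ** 2))             # averaged squared penalty

# Physically plausible materials (sums <= 1) incur zero penalty.
print(energy_conservation_loss(np.array([0.4, 0.2]), np.array([0.5, 0.3])))  # 0.0
```

In a joint optimization, such a term would be added to the photometric rendering loss with a weighting coefficient, steering the material network toward plausible BRDFs without constraining geometry.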

πŸ“ Abstract
We tackle the ill-posed inverse rendering problem in 3D reconstruction with a Neural Radiance Field (NeRF) approach informed by Physics-Based Rendering (PBR) theory, named PBR-NeRF. Our method addresses a key limitation in most NeRF and 3D Gaussian Splatting approaches: they estimate view-dependent appearance without modeling scene materials and illumination. To address this limitation, we present an inverse rendering (IR) model capable of jointly estimating scene geometry, materials, and illumination. Our model builds upon recent NeRF-based IR approaches, but crucially introduces two novel physics-based priors that better constrain the IR estimation. Our priors are rigorously formulated as intuitive loss terms and achieve state-of-the-art material estimation without compromising novel view synthesis quality. Our method is easily adaptable to other inverse rendering and 3D reconstruction frameworks that require material estimation. We demonstrate the importance of extending current neural rendering approaches to fully model scene properties beyond geometry and view-dependent appearance. Code is publicly available at https://github.com/s3anwu/pbrnerf
Problem

Research questions and friction points this paper is trying to address.

Inverse rendering for 3D scene reconstruction
Joint estimation of geometry, materials, illumination
Physics-based constraints for material estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Physics-based Neural Radiance Field approach
Joint estimation of geometry, materials, illumination
Novel physics-based priors as loss terms