🤖 AI Summary
This work addresses the challenge of high-quality reconstruction of autonomous driving scenes under low-light conditions, where existing methods suffer from inadequate modeling of material reflectance under complex illumination. To overcome this limitation, the paper introduces physically based rendering into the 3D Gaussian Splatting (3DGS) framework. It jointly optimizes BRDF material parameters, modeling the diffuse reflectance component with a global illumination module and the specular component with an anisotropic spherical Gaussian representation. This approach enables high-fidelity, real-time reconstruction of outdoor nighttime scenes. Evaluated on real-world nighttime driving data from nuScenes and Waymo, the proposed method outperforms state-of-the-art techniques in both quantitative metrics and qualitative visual quality while maintaining real-time rendering performance.
📝 Abstract
This paper focuses on scene reconstruction under nighttime conditions in autonomous driving simulation. Recent methods based on Neural Radiance Fields (NeRFs) and 3D Gaussian Splatting (3DGS) have achieved photorealistic modeling in autonomous driving scene reconstruction, but they primarily target normal-light conditions. Low-light driving scenes are more challenging to model due to their complex lighting and appearance conditions, which often cause performance degradation in existing methods. To address this problem, this work presents a novel approach that integrates physically based rendering into 3DGS to enhance nighttime scene reconstruction for autonomous driving. Specifically, our approach incorporates physically based rendering into composite scene Gaussian representations and jointly optimizes Bidirectional Reflectance Distribution Function (BRDF)-based material properties. We explicitly model diffuse components through a global illumination module and specular components with anisotropic spherical Gaussians. As a result, our approach improves reconstruction quality for outdoor nighttime driving scenes while maintaining real-time rendering. Extensive experiments across diverse nighttime scenarios on two real-world autonomous driving datasets, nuScenes and Waymo, demonstrate that our approach outperforms state-of-the-art methods both quantitatively and qualitatively.
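To make the specular model concrete: an anisotropic spherical Gaussian (ASG), in the common parameterization of Xu et al. (2013), is a clamped-cosine lobe modulated by a 2D Gaussian over two orthogonal tangent axes, which lets it stretch along one direction to fit elongated highlights such as streetlight glints on wet roads. The sketch below is illustrative only; the function name, frame layout, and bandwidth parameters are assumptions for exposition, not the paper's exact parameterization.

```python
import numpy as np

def asg(v, frame, amplitude, lam, mu):
    """Evaluate an anisotropic spherical Gaussian lobe (illustrative sketch).

    v         : unit view/query direction, shape (3,)
    frame     : rows (x, y, z) of an orthonormal local frame; z is the lobe axis
    amplitude : lobe amplitude (scalar or per-channel)
    lam, mu   : bandwidths along the tangent axes x and y (larger = sharper);
                lam != mu makes the lobe anisotropic
    """
    x_axis, y_axis, z_axis = frame
    # Smooth term: clamped cosine around the lobe axis (zero on the back side).
    smooth = max(float(np.dot(v, z_axis)), 0.0)
    # Anisotropic falloff: independent Gaussian decay along the two tangent axes.
    falloff = np.exp(-lam * np.dot(v, x_axis) ** 2 - mu * np.dot(v, y_axis) ** 2)
    return amplitude * smooth * falloff
```

Along the lobe axis the tangent projections vanish, so the lobe peaks at its amplitude; directions behind the lobe evaluate to zero. In a renderer, the specular color is typically a sum of several such lobes with learnable frames, amplitudes, and bandwidths.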