🤖 AI Summary
This paper addresses the problem of generating spatially varying high-dynamic-range (HDR) environment lighting from a single LDR input image for physically plausible 3D rendering, avoiding both the reliance on HDR inputs and the spatial inconsistency that limit prior approaches. To this end, we propose a novel differentiable light representation based on HDR Gaussian splats, establishing an "image-as-light-source" paradigm. Our method integrates a diffusion-based dynamic-range expansion module with 3D Gaussian splat modeling and spatially varying rendering in an end-to-end trainable framework for HDR illumination estimation. Key contributions include: (1) the first saturation-free, fully calibrated HDR lighting evaluation dataset; (2) state-of-the-art performance on both this curated benchmark and public datasets; and (3) significantly improved lighting consistency and photorealism when compositing virtual objects.
📝 Abstract
We present GaSLight, a method that generates spatially varying lighting from regular images. Our method uses HDR Gaussian splats as the light-source representation, marking the first time regular images can serve as light sources in a 3D renderer. Our two-stage process first expands the dynamic range of images plausibly and accurately by leveraging the priors embedded in diffusion models. Next, we employ Gaussian splats to model 3D lighting, achieving spatially varying illumination. Our approach yields state-of-the-art results on HDR estimation and its applications in illuminating virtual objects and scenes. To facilitate benchmarking images as light sources, we introduce a novel dataset of calibrated, unsaturated HDR images. We evaluate our method on a combination of this novel dataset and an existing dataset from the literature. The code to reproduce our method will be made available upon acceptance.
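As a rough illustration of the two-stage pipeline described in the abstract (and not the paper's actual implementation), the sketch below substitutes a naive inverse tone map for the diffusion-based dynamic-range expansion module and a simplified point-light evaluation for HDR Gaussian splats. All function names and parameters here are hypothetical stand-ins:

```python
import numpy as np

def expand_dynamic_range(ldr, gamma=2.4, scale=8.0):
    """Stage 1 (toy stand-in for the diffusion-based module):
    a naive inverse tone map that linearizes the LDR image and
    boosts near-saturated pixels to recover plausible intensity."""
    linear = np.clip(ldr, 0.0, 1.0) ** gamma
    boost = 1.0 + (scale - 1.0) * np.clip(ldr - 0.9, 0.0, 0.1) / 0.1
    return linear * boost

def image_to_lights(hdr, radius=10.0):
    """Stage 2 (simplified): lift each pixel of an equirectangular
    HDR image onto a sphere, yielding (position, radiance) pairs
    that approximate Gaussian splat light sources as point lights."""
    h, w, _ = hdr.shape
    theta = (np.arange(h) + 0.5) / h * np.pi        # polar angle
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi    # azimuth
    t, p = np.meshgrid(theta, phi, indexing="ij")
    positions = radius * np.stack(
        [np.sin(t) * np.cos(p), np.cos(t), np.sin(t) * np.sin(p)],
        axis=-1,
    ).reshape(-1, 3)
    return positions, hdr.reshape(-1, 3)

def incident_radiance(query, positions, radiance):
    """Spatially varying lighting: an inverse-square-weighted average
    of the light sources, so the answer depends on the 3D query point."""
    d = np.linalg.norm(positions - query, axis=1, keepdims=True)
    w = 1.0 / (d**2 + 1e-6)
    return (w * radiance).sum(axis=0) / w.sum()
```

Because the lights live at 3D positions rather than at infinity, two different query points receive different illumination, which is the spatial variation the environment-map baseline lacks.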