SeHDR: Single-Exposure HDR Novel View Synthesis via 3D Gaussian Bracketing

📅 2025-09-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenging problem of synthesizing HDR novel views from multi-view LDR images captured at a single exposure, bypassing the distortions (e.g., motion blur and misalignment) inherent in conventional multi-exposure capture. The paper proposes *Bracketed 3D Gaussians*, the first method to jointly estimate scene geometry and linear color distributions across multiple exposure values (EVs) within the 3D Gaussian Splatting framework using only single-exposure inputs; colors are parameterized by spherical harmonics in a linear color space. To fuse these EV-specific 3D Gaussians robustly in HDR space, the authors introduce a differentiable Neural Exposure Fusion (NeEF) module. Experiments on both synthetic and real-world datasets demonstrate that the approach significantly outperforms existing HDR novel view synthesis methods in visual quality, dynamic range recovery, and geometric consistency.

📝 Abstract
This paper presents SeHDR, a novel high dynamic range 3D Gaussian Splatting (HDR-3DGS) approach for generating HDR novel views given multi-view LDR images. Unlike existing methods that typically require the multi-view LDR input images to be captured from different exposures, which are tedious to capture and more likely to suffer from errors (e.g., object motion blurs and calibration/alignment inaccuracies), our approach learns the HDR scene representation from multi-view LDR images of a single exposure. Our key insight to this ill-posed problem is that by first estimating Bracketed 3D Gaussians (i.e., with different exposures) from single-exposure multi-view LDR images, we may then be able to merge these bracketed 3D Gaussians into an HDR scene representation. Specifically, SeHDR first learns base 3D Gaussians from single-exposure LDR inputs, where the spherical harmonics parameterize colors in a linear color space. We then estimate multiple 3D Gaussians with identical geometry but varying linear colors conditioned on exposure manipulations. Finally, we propose the Differentiable Neural Exposure Fusion (NeEF) to integrate the base and estimated 3D Gaussians into HDR Gaussians for novel view rendering. Extensive experiments demonstrate that SeHDR outperforms existing methods as well as carefully designed baselines.
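The core idea, estimating multiple 3D Gaussians with identical geometry but exposure-manipulated linear colors, rests on a simple exposure model. A minimal NumPy sketch of that model is below; it is an idealization, not the paper's method (SeHDR predicts the EV-conditioned colors with a network rather than applying a fixed gain), and the function names are illustrative:

```python
import numpy as np

def bracket_linear_color(base_color, ev):
    """Re-expose a linear-space color by `ev` stops (scale by 2^ev).

    Idealized exposure model; SeHDR learns the EV-conditioned linear
    colors instead of applying this fixed gain.
    """
    return base_color * (2.0 ** ev)

def to_ldr(linear_color, gamma=2.2):
    """Clip to [0, 1] and gamma-encode, simulating an LDR observation."""
    return np.clip(linear_color, 0.0, 1.0) ** (1.0 / gamma)

# A base linear RGB color; the 1.5 channel already clips at EV 0.
base = np.array([0.25, 0.5, 1.5])
bracket = {ev: to_ldr(bracket_linear_color(base, ev)) for ev in (-2, 0, 2)}
```

This illustrates why a bracket is useful for HDR: the EV -2 rendering retains detail in the highlight channel that is clipped at EV 0 and EV +2.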
Problem

Research questions and friction points this paper is trying to address.

Generating HDR novel views from single-exposure LDR images
Overcoming limitations of multi-exposure capture requirements
Learning HDR scene representation from limited exposure data
Innovation

Methods, ideas, or system contributions that make the work stand out.

HDR novel views from single-exposure LDR images
Estimates bracketed 3D Gaussians with varying exposures
Uses Differentiable Neural Exposure Fusion for HDR integration
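To make the fusion step concrete, here is a differentiable, hand-crafted stand-in for what a neural exposure fusion module must compute: per-pixel weights over the bracket, followed by a weighted merge of the linear radiance estimates. This is a Mertens-style well-exposedness heuristic, not NeEF itself (which learns the weights), and `fusion_weights`/`fuse_hdr` are illustrative names:

```python
import numpy as np

def fusion_weights(ldr_stack, sigma=0.2):
    """Per-pixel well-exposedness weights, normalized over the bracket axis.

    Mertens-style stand-in for NeEF's learned fusion: values near
    mid-gray (0.5) get high weight, clipped values get low weight.
    """
    w = np.exp(-((ldr_stack - 0.5) ** 2) / (2.0 * sigma ** 2))
    return w / w.sum(axis=0, keepdims=True)

def fuse_hdr(linear_stack, ldr_stack):
    """Fuse per-exposure linear radiance estimates into one HDR estimate."""
    return (fusion_weights(ldr_stack) * linear_stack).sum(axis=0)

# Three pixels observed at EVs -2, 0, +2 (bracket axis first).
evs = np.array([-2.0, 0.0, 2.0])
ldr = np.array([[0.10, 0.50, 0.90],
                [0.50, 0.90, 1.00],
                [0.90, 1.00, 1.00]])
linear = ldr / (2.0 ** evs)[:, None]   # undo each exposure gain
hdr = fuse_hdr(linear, ldr)
```

Because every operation here is smooth (apart from the clipping that produced the LDR inputs), gradients flow through the fusion, which is the property SeHDR needs to train the bracketed Gaussians and the fusion end to end.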
Yiyu Li
City University of Hong Kong
Haoyuan Wang
University of Pennsylvania, Applied Mathematics and Computational Science; Biostatistics
Ke Xu
City University of Hong Kong
Gerhard Petrus Hancke
City University of Hong Kong
Rynson W. H. Lau
City University of Hong Kong