InstantHDR: Single-forward Gaussian Splatting for High Dynamic Range 3D Reconstruction

📅 2026-03-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes InstantHDR, the first feed-forward neural network capable of reconstructing high-dynamic-range (HDR) 3D scenes from uncalibrated multi-exposure low-dynamic-range (LDR) images in a single forward pass, eliminating the need for per-scene optimization. Existing HDR reconstruction methods typically rely on known camera poses, dense point cloud initialization, and time-consuming optimization. In contrast, InstantHDR introduces a geometry-guided multi-exposure fusion mechanism, a generalizable meta-learned tone-mapping network, and an efficient Gaussian splatting-based scene representation. To support training, the authors also present HDR-Pretrain, the first large-scale synthetic dataset tailored for feed-forward HDR reconstruction. Experiments demonstrate that InstantHDR achieves reconstruction quality comparable to state-of-the-art optimization-based methods while offering speedups of approximately 700× without post-optimization or 20× with lightweight refinement.

📝 Abstract
High dynamic range (HDR) novel view synthesis (NVS) aims to reconstruct HDR scenes from multi-exposure low dynamic range (LDR) images. Existing HDR pipelines rely heavily on known camera poses, well-initialized dense point clouds, and time-consuming per-scene optimization. Current feed-forward alternatives overlook the HDR problem by assuming exposure-invariant appearance. To bridge this gap, we propose InstantHDR, a feed-forward network that reconstructs 3D HDR scenes from uncalibrated multi-exposure LDR collections in a single forward pass. Specifically, we design a geometry-guided appearance modeling module for multi-exposure fusion, and a meta-network for generalizable scene-specific tone mapping. Due to the lack of HDR scene data, we build a pre-training dataset, called HDR-Pretrain, for generalizable feed-forward HDR models, featuring 168 Blender-rendered scenes, diverse lighting types, and multiple camera response functions. Comprehensive experiments show that InstantHDR delivers synthesis performance comparable to state-of-the-art optimization-based HDR methods while enjoying $\sim700\times$ and $\sim20\times$ reconstruction speed improvements under our single-forward and post-optimization settings, respectively. All code, models, and datasets will be released after the review process.
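The multi-exposure fusion and camera-response-function (CRF) concepts mentioned above build on a classic idea: invert the CRF to linearize each LDR frame, divide by its exposure time, and blend the estimates with weights that suppress under- and over-exposed pixels. The sketch below is a minimal Debevec-style illustration of that idea, not the paper's learned, geometry-guided fusion; the simple gamma CRF and hat weighting are illustrative assumptions.

```python
import numpy as np

def inverse_crf(ldr, gamma=2.2):
    """Map LDR values in [0, 1] back to linear intensity.

    A plain gamma curve stands in for the CRF here; real pipelines
    calibrate (or, as in InstantHDR, learn) the response per camera.
    """
    return ldr ** gamma

def fuse_exposures(ldr_stack, exposure_times, gamma=2.2):
    """Fuse multi-exposure LDR images into a linear HDR radiance map.

    ldr_stack: sequence of same-shaped arrays with values in [0, 1].
    exposure_times: matching sequence of exposure times.
    Each frame yields a radiance estimate g^{-1}(Z) / t; a hat weight
    (peaking at mid-gray, zero at the clipped extremes) down-weights
    badly exposed pixels before averaging.
    """
    num = np.zeros_like(np.asarray(ldr_stack[0], dtype=np.float64))
    den = np.zeros_like(num)
    for ldr, t in zip(ldr_stack, exposure_times):
        ldr = np.asarray(ldr, dtype=np.float64)
        w = 1.0 - np.abs(2.0 * ldr - 1.0)  # hat weight in [0, 1]
        num += w * inverse_crf(ldr, gamma) / t
        den += w
    return num / np.maximum(den, 1e-8)
```

With a bracketed stack, a pixel saturated in the long exposures is still recovered from the short one, which is the property that makes multi-exposure LDR collections sufficient input for HDR reconstruction.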
Problem

Research questions and friction points this paper is trying to address.

HDR
novel view synthesis
multi-exposure
3D reconstruction
feed-forward
Innovation

Methods, ideas, or system contributions that make the work stand out.

feed-forward HDR reconstruction
geometry-guided appearance modeling
meta-network tone mapping
multi-exposure fusion
HDR-Pretrain dataset