🤖 AI Summary
Traditional indoor thermal reconstruction relies on labor-intensive on-site measurements and manual annotations. This work instead proposes an end-to-end method for automatically reconstructing indoor radiative-thermal distributions from a single pair of indoor-outdoor HDR panoramas. The approach jointly infers the 3D room layout, spatially varying illumination, and material properties, and introduces the first physics-based coupling model linking light transport to transient heat conduction, enabling fully automatic, annotation-free, and measurement-free digital thermal reconstruction. Key technical components include HDR panoramic parsing, single-image 3D layout estimation, environment-map-driven joint inverse rendering of lighting and materials, and sensitivity-guided efficient thermal simulation. Experiments demonstrate high fidelity: synthesized thermal panoramas achieve a mean absolute error below 1.2°C against ground-truth thermal imaging data, validating the method's accuracy and practicality in real-world scenarios.
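To make the "environment-map-driven" lighting component concrete, the sketch below shows the standard way an outdoor HDR panorama can be turned into incident diffuse irradiance on an indoor surface: integrating radiance against the cosine-weighted solid angle of a latitude-longitude environment map. This is a generic illustration of the technique, not the paper's actual pipeline; the function name, array layout, and parameters are assumptions for the example.

```python
import numpy as np

def diffuse_irradiance(env_map, normal):
    """Approximate diffuse irradiance [W/m^2] on a surface with unit `normal`,
    integrating a latitude-longitude (equirectangular) HDR environment map.
    Illustrative sketch only: env_map is an HxW (or HxWx3) radiance array."""
    h, w = env_map.shape[:2]
    # Pixel-center angles: theta = polar angle from +z, phi = azimuth.
    theta = (np.arange(h) + 0.5) / h * np.pi
    phi = (np.arange(w) + 0.5) / w * 2 * np.pi
    sin_t = np.sin(theta)
    # Unit direction vector for every pixel of the panorama.
    dirs = np.stack([
        np.outer(sin_t, np.cos(phi)),
        np.outer(sin_t, np.sin(phi)),
        np.outer(np.cos(theta), np.ones(w)),
    ], axis=-1)
    # Lambertian cosine term, clamped to the upper hemisphere.
    cos_term = np.clip(dirs @ np.asarray(normal, dtype=float), 0.0, None)
    # Solid angle of each pixel: sin(theta) * dtheta * dphi.
    d_omega = sin_t[:, None] * (np.pi / h) * (2 * np.pi / w)
    radiance = env_map.mean(axis=-1) if env_map.ndim == 3 else env_map
    return float(np.sum(radiance * cos_term * d_omega))
```

As a sanity check, a uniform sky of radiance 1 should yield an irradiance of pi on an upward-facing surface, which this discretization reproduces closely.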
📝 Abstract
This paper presents a novel application for directly estimating indoor light and heat maps from captured indoor-outdoor High Dynamic Range (HDR) panoramas. In our image-based rendering method, the indoor panorama is used to estimate the 3D room layout, while the corresponding outdoor panorama serves as an environment map for inferring spatially varying light and material properties. We establish a connection between indoor light transport and heat transport and implement a transient heat simulation to generate indoor heat panoramas. A sensitivity analysis of the thermal parameters is conducted, and the resulting heat maps are compared with images captured by a thermal camera in real-world scenarios. This digital application enables automatic indoor light and heat estimation without manual input or cumbersome field measurements.
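The light-to-heat coupling described above can be illustrated with a minimal transient conduction sketch: an explicit (forward-Euler) finite-difference solve of 1D heat flow through a wall slab, where absorbed radiation enters as a flux at the indoor surface. All material constants, dimensions, and boundary conditions here are illustrative assumptions, not the paper's model or parameter values.

```python
import numpy as np

def simulate_wall_temperature(q_abs, n=21, thickness=0.2, alpha=7e-7,
                              t_indoor=20.0, t_outdoor=5.0, h=8.0, k=1.4,
                              dt=60.0, steps=1440):
    """Explicit transient heat conduction through a 1D wall slab.

    q_abs     : absorbed radiative flux at the indoor face [W/m^2]
    n         : number of grid nodes across the wall
    alpha, k  : assumed diffusivity [m^2/s] and conductivity [W/(m K)]
    h         : assumed convective coefficient on both faces [W/(m^2 K)]
    Returns the temperature profile [degC] after `steps` time steps of dt.
    """
    dx = thickness / (n - 1)
    # Stability condition of the explicit scheme: alpha*dt/dx^2 <= 0.5.
    assert alpha * dt / dx**2 <= 0.5, "reduce dt or n for stability"
    T = np.full(n, t_indoor)  # start from a uniform indoor temperature
    for _ in range(steps):
        lap = np.zeros(n)
        lap[1:-1] = (T[:-2] - 2.0 * T[1:-1] + T[2:]) / dx**2
        T = T + alpha * dt * lap
        # Indoor face: convection plus the absorbed radiation source term,
        # applied via a simple surface flux balance.
        q_in = h * (t_indoor - T[0]) + q_abs
        T[0] = T[1] + q_in * dx / k
        # Outdoor face: convection to the outdoor air only.
        q_out = h * (t_outdoor - T[-1])
        T[-1] = T[-2] + q_out * dx / k
    return T
```

With a positive absorbed flux (e.g. `q_abs=200`), the indoor face warms above the room air while the outdoor face cools toward the colder exterior, which is the qualitative behavior a heat panorama visualizes per surface patch.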