Lunar-G2R: Geometry-to-Reflectance Learning for High-Fidelity Lunar BRDF Estimation

📅 2026-01-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing lunar rendering approaches often rely on simplified or spatially uniform BRDF models, which fail to accurately capture the local reflectance properties of lunar regolith, thereby limiting high-fidelity rendering and visual navigation. This work proposes a geometry-to-reflectance learning framework that directly predicts spatially varying BRDF parameters from lunar digital elevation models (DEMs), requiring only a single-view image and known illumination–viewing geometry—without multi-view data or specialized hardware. Built upon a U-Net architecture, the method integrates differentiable rendering with a physically based lighting model and optimizes photometric consistency between real and synthetic images in an end-to-end manner. Evaluated on a geographically held-out region of the Tycho crater, it reduces photometric error by 38% compared to a state-of-the-art baseline and achieves significant improvements in PSNR, SSIM, and perceptual similarity, marking the first demonstration of inferring spatially varying lunar surface reflectance solely from terrain geometry.

📝 Abstract
We address the problem of estimating realistic, spatially varying reflectance for complex planetary surfaces such as the lunar regolith, which is critical for high-fidelity rendering and vision-based navigation. Existing lunar rendering pipelines rely on simplified or spatially uniform BRDF models whose parameters are difficult to estimate and fail to capture local reflectance variations, limiting photometric realism. We propose Lunar-G2R, a geometry-to-reflectance learning framework that predicts spatially varying BRDF parameters directly from a lunar digital elevation model (DEM), without requiring multi-view imagery, controlled illumination, or dedicated reflectance-capture hardware at inference time. The method leverages a U-Net trained with differentiable rendering to minimize photometric discrepancies between real orbital images and physically based renderings under known viewing and illumination geometry. Experiments on a geographically held-out region of the Tycho crater show that our approach reduces photometric error by 38% compared to a state-of-the-art baseline, while achieving higher PSNR and SSIM and improved perceptual similarity, capturing fine-scale reflectance variations absent from spatially uniform models. To our knowledge, this is the first method to infer a spatially varying reflectance model directly from terrain geometry.
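The photometric-consistency objective described in the abstract can be illustrated with a deliberately simplified sketch: surface normals are derived from a DEM by finite differences, a Lambertian cosine term stands in for the paper's physically based lunar reflectance model, and a per-pixel albedo map stands in for the spatially varying BRDF parameters that the U-Net would predict. All function names, the lighting setup, and the loss choice below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normals_from_dem(dem, cell=1.0):
    """Per-pixel unit surface normals from an elevation grid (central differences)."""
    dz_dy, dz_dx = np.gradient(dem, cell)          # elevation gradients along rows/cols
    n = np.stack([-dz_dx, -dz_dy, np.ones_like(dem)], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

def render_lambert(albedo, normals, sun_dir):
    """Render a single-view image: spatially varying albedo times cosine of incidence.

    A stand-in for the physically based lighting model; known illumination
    geometry enters through `sun_dir`.
    """
    sun = np.asarray(sun_dir, dtype=float)
    sun = sun / np.linalg.norm(sun)
    cos_i = np.clip(normals @ sun, 0.0, None)      # shadowed pixels clamp to zero
    return albedo * cos_i

def photometric_loss(rendered, observed):
    """L1 photometric discrepancy between synthetic and real images.

    In the full pipeline this loss would be backpropagated through a
    differentiable renderer into the network predicting the BRDF parameters.
    """
    return float(np.mean(np.abs(rendered - observed)))
```

With a real orbital image in place of `observed`, minimizing this loss over the albedo map (or, in the paper, over U-Net weights producing the BRDF parameters) is what "optimizing photometric consistency end-to-end" amounts to conceptually.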
Problem

Research questions and friction points this paper is trying to address.

lunar reflectance
spatially varying BRDF
photometric realism
planetary surface rendering
geometry-to-reflectance
Innovation

Methods, ideas, or system contributions that make the work stand out.

geometry-to-reflectance learning
spatially varying BRDF
differentiable rendering
lunar regolith
digital elevation model
Clémentine Grethen
IRIT, University of Toulouse, France
Nicolas Menga
Airbus Defence and Space, Toulouse, France
Roland Brochard
Airbus Defence and Space, Toulouse, France
Géraldine Morin
Simone Gasparini
IRIT, University of Toulouse, France
Jérémy Lebreton
Airbus Defence and Space, Toulouse, France
Manuel Sanchez-Gestido
European Space Agency (ESA) ESTEC, Noordwijk, The Netherlands