Spatial Calibration of Diffuse LiDARs

📅 2026-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of precise spatial alignment between diffuse direct time-of-flight (dToF) LiDAR and RGB images, which is hindered by each LiDAR pixel's wide field of view: returns are aggregated over a large solid angle, invalidating the conventional single-ray assumption. The authors propose a spatial calibration method that scans a reflective patch and employs background subtraction to estimate, for each LiDAR pixel, its effective support region (footprint) and relative spatial sensitivity on the RGB image plane. This yields the first spatial response map for dToF LiDAR pixels, overcoming the limitations of traditional cross-modal calibration that relies on the single-ray model. Experimental validation on the ams OSRAM TMF8828 sensor demonstrates that the method establishes high-precision cross-modal correspondences, significantly improving multimodal fusion quality.

📝 Abstract
Diffuse direct time-of-flight LiDARs report per-pixel depth histograms formed by aggregating photon returns over a wide instantaneous field of view, violating the single-ray assumption behind standard LiDAR-RGB calibration. We present a simple spatial calibration procedure that estimates, for each diffuse LiDAR pixel, its footprint (effective support region) and relative spatial sensitivity in a co-located RGB image plane. Using a scanned retroreflective patch with background subtraction, we recover per-pixel response maps that provide an explicit LiDAR-to-RGB correspondence for cross-modal alignment and fusion. We demonstrate the method on the ams OSRAM TMF8828.
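The procedure in the abstract (scan a retroreflective patch across the scene, subtract the patch-free background, and accumulate the residual signal per patch position) can be sketched as follows. This is a minimal illustration under assumed data shapes, not the authors' implementation; the function name, inputs, and normalization choice are all hypothetical.

```python
import numpy as np

def estimate_response_maps(patch_frames, background, patch_positions, grid_shape):
    """Estimate per-pixel spatial response maps for a diffuse dToF LiDAR.

    patch_frames:    (K, P) photon counts for each of P LiDAR pixels,
                     one row per patch position in the scan
    background:      (P,) counts per pixel with no patch in the scene
    patch_positions: (K, 2) integer (row, col) of the patch center,
                     discretized on the RGB image plane
    grid_shape:      (H, W) resolution of the response map on the RGB plane
    Returns a (P, H, W) array of relative spatial sensitivities,
    each pixel's map normalized to sum to 1 where signal exists.
    """
    K, P = patch_frames.shape
    H, W = grid_shape
    maps = np.zeros((P, H, W))
    # Background subtraction isolates the signal contributed by the patch alone;
    # clipping at zero discards noise fluctuations below the background level.
    signal = np.clip(patch_frames - background[None, :], 0.0, None)  # (K, P)
    for k in range(K):
        r, c = patch_positions[k]
        maps[:, r, c] += signal[k]  # accumulate each pixel's response here
    # Normalize so each map reads as a relative spatial sensitivity (footprint).
    totals = maps.sum(axis=(1, 2), keepdims=True)
    return np.where(totals > 0, maps / np.maximum(totals, 1e-12), maps)
```

The support of each normalized map gives the pixel's footprint on the RGB plane, and the map values give the explicit LiDAR-to-RGB correspondence weights used for fusion.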
Problem

Research questions and friction points this paper is trying to address.

Diffuse LiDAR
Spatial Calibration
LiDAR-RGB Alignment
Depth Histogram
Cross-modal Fusion
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diffuse LiDAR
Spatial Calibration
Time-of-Flight
Cross-modal Alignment
Footprint Estimation