MAROON: A Dataset for the Joint Characterization of Near-Field High-Resolution Radio-Frequency and Optical Depth Imaging Techniques

📅 2024-11-01
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This study investigates depth measurement performance disparities among optical sensors (structured light, time-of-flight, stereo vision) and high-resolution imaging radar in decimeter-scale near-field scenarios. Addressing systematic biases induced by material properties (including transparent and low-reflectivity objects), geometric complexity, and varying distances, we introduce MAROON—a novel multimodal joint-representation dataset featuring synchronized depth maps, point clouds, and unified spatial calibration parameters from three optical modalities and RF-based radar. We propose a cross-modal joint calibration and error modeling framework that characterizes RF scattering responses in partially transmissive materials. Experiments demonstrate that RF imaging achieves significantly higher depth completeness on transparent and low-reflectivity objects, whereas optical sensors deliver superior accuracy on texture-rich surfaces. The MAROON dataset is publicly released to support benchmarking and algorithm development in multimodal depth sensing.

📝 Abstract
Utilizing the complementary strengths of wavelength-specific range or depth sensors is crucial for robust computer-assisted tasks such as autonomous driving. Despite this, there is still little research at the intersection of optical depth sensors and radars operating at close range, where the target is decimeters away from the sensors. Together with a growing interest in high-resolution imaging radars operating in the near field, the question arises of how these sensors behave in comparison to their traditional optical counterparts. In this work, we take on the unique challenge of jointly characterizing depth imagers from both the optical and the radio-frequency domain using a multimodal spatial calibration. We collect data from four depth imagers: three optical sensors of varying operation principles and an imaging radar. We provide a comprehensive evaluation of their depth measurements with respect to distinct object materials, geometries, and object-to-sensor distances. Specifically, we reveal scattering effects of partially transmissive materials and investigate the response of radio-frequency signals. All object measurements will be made public in the form of a multimodal dataset called MAROON.
Problem

Research questions and friction points this paper is trying to address.

Characterizing optical and radar depth sensors in near-field scenarios
Evaluating depth measurements across materials, geometries, and distances
Investigating scattering effects and radio-frequency signal responses
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multimodal spatial calibration for joint characterization
Four depth imagers including optical and radar sensors
Public multimodal dataset evaluating material and distance effects
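The paper's headline finding (higher depth completeness for radar on transparent and low-reflectivity objects, higher accuracy for optical sensors on textured surfaces) implies per-modality depth-map metrics. Below is a minimal, hypothetical sketch of two such metrics — completeness and mean absolute error — computed on depth maps already aligned via a joint spatial calibration; the function name, validity thresholds, and toy data are illustrative assumptions, not taken from MAROON itself.

```python
import numpy as np

def depth_metrics(depth, reference, valid_min=0.05, valid_max=1.0):
    """Compare a sensor depth map against an aligned reference (meters).

    Completeness: fraction of valid reference pixels for which the
    sensor also returned a valid measurement. MAE: mean absolute
    depth error over pixels valid in both maps.
    Thresholds are illustrative, roughly matching a decimeter-scale
    near-field setup.
    """
    ref_valid = (reference > valid_min) & (reference < valid_max)
    sen_valid = (depth > valid_min) & (depth < valid_max)
    both = ref_valid & sen_valid
    completeness = both.sum() / max(ref_valid.sum(), 1)
    mae = np.abs(depth[both] - reference[both]).mean() if both.any() else np.nan
    return completeness, mae

# Toy example: a sensor with dropouts on a "transparent" image region,
# where invalid returns are encoded as depth 0.
reference = np.full((4, 4), 0.3)   # ground-truth depth, 30 cm everywhere
sensor = reference.copy()
sensor[:2, :] = 0.0                # upper half: no valid return
comp, mae = depth_metrics(sensor, reference)
# comp → 0.5 (half the pixels measured), mae → 0.0 on the measured half
```

Separating completeness from accuracy in this way is what lets the evaluation distinguish a sensor that measures everywhere but noisily from one that measures precisely but only on cooperative materials.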