Point Cloud Recombination: Systematic Real Data Augmentation Using Robotic Targets for LiDAR Perception Validation

📅 2025-05-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Open-world LiDAR perception validation is hindered by the trade-off between poor controllability of real-world scenarios and physical inaccuracies in simulation. To address this, we propose a physics-informed point cloud recombination method using physical human-shaped targets: high-precision multi-pose, multi-material target point clouds are captured in lab settings using an Ouster OS1-128 LiDAR; these are then registered to 3D meshes and jointly rendered with geometric and intensity attributes before being dynamically fused into real-world road-scene point clouds. The resulting synthetic scenes preserve sensor-level physical fidelity—including material-dependent intensity response—while enabling full scene controllability. This work presents the first systematic recombination of physical target point clouds with field-collected data, supporting fine-grained occlusion modeling and joint algorithm-sensor robustness attribution. Experiments show reconstruction error <2.1% versus ground truth, significantly improving reproducibility of edge cases and credibility of failure root-cause analysis.
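The recombination step the summary describes — fusing a lab-captured target point cloud into a field-collected scene while modeling the occlusion the target would cause — can be illustrated with a minimal sketch. The function names, the spherical-grid occlusion test, and the 0.35° binning resolution are illustrative assumptions, not the authors' implementation; the paper's mesh registration and intensity rendering are omitted here.

```python
import numpy as np

def spherical_bins(points, az_res=0.35, el_res=0.35):
    """Map Cartesian sensor-frame points to (azimuth, elevation) grid
    cells in degrees, and return per-point range."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    rng = np.linalg.norm(points, axis=1)
    az = np.degrees(np.arctan2(y, x))
    el = np.degrees(np.arcsin(z / np.maximum(rng, 1e-9)))
    cells = np.stack([np.floor(az / az_res), np.floor(el / el_res)], axis=1)
    return cells.astype(np.int64), rng

def recombine(scene, target, az_res=0.35, el_res=0.35):
    """Fuse a lab-captured target cloud into a real-world scene cloud.

    Scene points whose viewing ray would hit the target at a shorter
    range are dropped, approximating the occlusion a physically present
    target would cause for the sensor.
    """
    t_cells, t_rng = spherical_bins(target, az_res, el_res)
    # Nearest target range per occupied angular cell.
    nearest = {}
    for cell, r in zip(map(tuple, t_cells), t_rng):
        if r < nearest.get(cell, np.inf):
            nearest[cell] = r
    s_cells, s_rng = spherical_bins(scene, az_res, el_res)
    # Keep a scene point unless a target point sits closer on its ray.
    keep = np.array([r <= nearest.get(tuple(c), np.inf)
                     for c, r in zip(s_cells, s_rng)])
    return np.vstack([scene[keep], target])
```

In this toy setup, a scene point directly behind the inserted target is removed, while points on other rays survive; the real method additionally uses the registered 3D mesh rather than raw points to decide occlusion.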

📝 Abstract
The validation of LiDAR-based perception for intelligent mobile systems in open-world applications remains challenging due to the variability of real environmental conditions. Virtual simulations can generate arbitrary scenes under controlled conditions but lack physical sensor characteristics, such as intensity responses or material-dependent effects. Real-world data, in contrast, offers true sensor realism but provides little control over influencing factors, hindering sufficient validation. Existing approaches address this problem by augmenting real-world point cloud data, transferring objects between scenes. However, these methods are not designed for validation and remain limited in controllability because they rely on empirical data. We address these limitations with Point Cloud Recombination, which systematically augments captured point cloud scenes by integrating point clouds of physical target objects measured in controlled laboratory environments. This enables the creation of large numbers and varieties of repeatable, physically accurate test scenes with phenomena-aware occlusions derived from registered 3D meshes. Using the Ouster OS1-128 Rev7 sensor, we demonstrate the augmentation of real-world urban and rural scenes with humanoid targets in varied clothing and poses, positioned repeatably. We show that the recombined scenes closely match real sensor outputs, enabling targeted testing, scalable failure analysis, and improved system safety. By providing controlled yet sensor-realistic data, our method supports trustworthy conclusions about the limitations of specific sensors in combination with their algorithms, e.g., for object detection.
Problem

Research questions and friction points this paper is trying to address.

Validating LiDAR perception in variable real-world conditions
Augmenting real point clouds with controlled lab-measured targets
Enabling repeatable, physically accurate test scenes for sensor validation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Systematic augmentation using robotic targets
Controlled lab-measured point cloud integration
Phenomena-aware occlusions with 3D meshes
Hubert Padusinski
FZI Research Center for Information Technology, Karlsruhe, Germany
Christian Steinhauser
FZI Research Center for Information Technology
Automotive Testing, Hardware-in-the-Loop Testing
Christian Scherl
FZI Research Center for Information Technology, Karlsruhe, Germany
Julian Gaal
ANavS GmbH - Advanced Navigation Solutions, München, Germany
Jacob Langner
FZI Forschungszentrum Informatik
Big Data Methods, Machine Learning, Automotive Function Development