NeRF-To-Real Tester: Neural Radiance Fields as Test Image Generators for Vision of Autonomous Systems

📅 2024-12-20
🏛️ arXiv.org
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address the problem of visual components in autonomous underwater and aerial vehicles (AUVs/UAVs) overfitting to simulation environments, with consequent performance degradation when deployed in real-world settings, this paper introduces neural radiance fields (NeRF) for perception test data generation, the first application of NeRF in this domain. Leveraging differentiable rendering and camera pose perturbations, the method synthesizes high-fidelity, diverse image datasets that explicitly bridge the simulation-to-reality gap. Integrated into a metamorphic testing framework, it enables targeted robustness evaluation of vision-based modules such as vSLAM and object detection. Evaluation across eight representative visual components demonstrates improved defect detection rates, and the generated images outperform baselines in both photorealism and diversity metrics. The core contribution is a NeRF-driven perception testing paradigm for autonomous systems, forming a closed loop from scene modeling and pose perturbation to robustness assessment.
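The summary mentions rendering test images from perturbed camera poses. As an illustrative sketch only (the function name, parameters, and bounds below are assumptions, not the paper's actual implementation), a small random rotation and translation can be composed onto a camera-to-world pose matrix before handing it to a NeRF renderer:

```python
import numpy as np

def perturb_pose(c2w, max_rot_deg=2.0, max_trans=0.05, seed=None):
    """Apply a small random rotation and translation to a 4x4
    camera-to-world pose, producing a nearby test viewpoint.
    Bounds are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    # Random rotation axis (unit vector) and a small angle.
    axis = rng.normal(size=3)
    axis /= np.linalg.norm(axis)
    angle = np.deg2rad(rng.uniform(-max_rot_deg, max_rot_deg))
    # Rodrigues' formula: R = I + sin(a) K + (1 - cos(a)) K^2.
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    # Compose the perturbation with the original pose.
    delta = np.eye(4)
    delta[:3, :3] = R
    delta[:3, 3] = rng.uniform(-max_trans, max_trans, size=3)
    return delta @ c2w
```

Each perturbed pose would then be rendered by the trained NeRF model to obtain a follow-up test image for the vision component under test.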

๐Ÿ“ Abstract
Autonomous inspection of infrastructure on land and in water is a quickly growing market, with applications including surveying constructions, monitoring plants, and tracking environmental changes in on- and off-shore wind energy farms. For Autonomous Underwater Vehicles and Unmanned Aerial Vehicles, overfitting of controllers to simulation conditions fundamentally leads to poor performance in the operation environment. There is a pressing need for more diverse and realistic test data that accurately represents the challenges faced by these systems. We address the challenge of generating perception test data for autonomous systems by leveraging Neural Radiance Fields to generate realistic and diverse test images, and integrating them into a metamorphic testing framework for vision components such as vSLAM and object detection. Our tool, N2R-Tester, allows training models of custom scenes and rendering test images from perturbed positions. An experimental evaluation of N2R-Tester on eight different vision components in AUVs and UAVs demonstrates the efficacy and versatility of the approach.
Problem

Research questions and friction points this paper is trying to address.

Generating realistic test images for autonomous systems
Addressing overfitting in simulation-trained controllers
Enhancing vision components like vSLAM and object detection
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leveraging Neural Radiance Fields for test images
Integrating images into metamorphic testing framework
Rendering test images from perturbed positions
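The innovations above combine into a metamorphic testing loop: render a source view and a perturbed follow-up view, run the vision component on both, and check that the outputs stay consistent. A minimal sketch of one such metamorphic relation, with `render` and `detect` as hypothetical stand-ins for the NeRF renderer and an object detector (the relation and threshold below are illustrative assumptions, not the paper's exact oracle):

```python
def metamorphic_detection_test(render, detect, pose, perturbed_pose,
                               max_count_diff=1):
    """Metamorphic relation (illustrative): a small viewpoint change
    should not change the number of detected objects by more than
    max_count_diff. `render(pose)` yields an image; `detect(image)`
    yields a list of detections."""
    n_source = len(detect(render(pose)))
    n_followup = len(detect(render(perturbed_pose)))
    # The relation holds if detection counts roughly agree.
    return abs(n_source - n_followup) <= max_count_diff
```

A violation of the relation flags a potential robustness defect in the vision component, without requiring ground-truth labels for the rendered images.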
Laura Weihl
IT University of Copenhagen, Copenhagen, Denmark
Bilal Wehbe
PhD, German Research Center for Artificial Intelligence - Robotics Innovation Center
Underwater Robotics · Modeling and Dynamics · Artificial Intelligence · Machine Learning
Andrzej Wąsowski
IT University of Copenhagen, Copenhagen, Denmark