Real-Time Glass Detection and Reprojection using Sensor Fusion Onboard Aerial Robots

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Transparent obstacles—such as glass—lack texture and exhibit specular reflections, causing conventional depth sensors to fail and posing severe risks to navigation safety and mapping accuracy for low-SWaP (Size, Weight, and Power) UAVs. To address this, we propose a lightweight, real-time solution that fuses time-of-flight (ToF) camera and ultrasonic sensor data, coupled with an embedded-CPU-only 2D convolutional neural network for transparent obstacle detection, depth map inpainting, and reprojection-based mapping. To our knowledge, this is the first fully onboard, real-time (>15 Hz) transparent-object perception and mapping system deployed on a sub-300 g quadrotor, operating with low computational overhead (<40% CPU utilization). Evaluated across diverse indoor and outdoor scenarios, the method significantly improves obstacle avoidance robustness and map completeness, demonstrating both efficacy and practicality on resource-constrained aerial platforms.

📝 Abstract
Autonomous aerial robots are increasingly being deployed in real-world scenarios, where transparent obstacles present significant challenges to reliable navigation and mapping. These materials pose a unique problem for traditional perception systems because they lack discernible features and can cause conventional depth sensors to fail, leading to inaccurate maps and potential collisions. To ensure safe navigation, robots must be able to accurately detect and map these transparent obstacles. Existing methods often rely on large, expensive sensors or algorithms that impose high computational burdens, making them unsuitable for low Size, Weight, and Power (SWaP) robots. In this work, we propose a novel and computationally efficient framework for detecting and mapping transparent obstacles onboard a sub-300 g quadrotor. Our method fuses data from a Time-of-Flight (ToF) camera and an ultrasonic sensor with a custom, lightweight 2D convolution model. This specialized approach accurately detects specular reflections and propagates their depth into corresponding empty regions of the depth map, effectively rendering transparent obstacles visible. The entire pipeline operates in real time, utilizing only a small fraction of a CPU core on an embedded processor. We validate our system through a series of experiments in both controlled and real-world environments, including flights in which the robot maps indoor environments containing glass. Our work is, to our knowledge, the first to demonstrate a real-time, onboard transparent obstacle mapping system on a low-SWaP quadrotor using only the CPU.
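The depth-inpainting idea described above (detect specular reflections, then propagate depth into the empty regions of the ToF depth map) can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the function name, the zero-depth invalid convention, and the use of a single scalar ultrasonic range are all assumptions for clarity; the paper's detector is a learned 2D convolution model, whereas here the reflection mask is simply taken as an input.

```python
import numpy as np

def inpaint_transparent(depth, reflection_mask, ultrasonic_range, invalid=0.0):
    """Fill invalid ToF pixels flagged as specular reflections with the
    ultrasonic range, making a transparent surface visible in the depth map.

    depth           : (H, W) float array from the ToF camera
    reflection_mask : (H, W) bool array from a reflection detector (here,
                      assumed given; the paper uses a lightweight 2D CNN)
    ultrasonic_range: scalar range (m) reported by the ultrasonic sensor
    """
    filled = depth.copy()
    holes = depth == invalid  # pixels where the ToF sensor returned no depth
    # Propagate the ultrasonic range only into holes that the detector
    # attributes to a transparent surface, leaving other holes untouched.
    filled[holes & reflection_mask] = ultrasonic_range
    return filled

# Toy example: a 2x2 depth map with two missing pixels, one of them
# flagged as a reflection from glass at 1.5 m.
depth = np.array([[0.0, 2.0],
                  [0.0, 0.0]])
mask = np.array([[True, False],
                 [False, True]])
out = inpaint_transparent(depth, mask, 1.5)
```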
Problem

Research questions and friction points this paper is trying to address.

Detect transparent obstacles for aerial robot navigation
Enable real-time onboard mapping with low computational cost
Fuse sensor data to overcome depth sensor limitations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fuses ToF camera and ultrasonic sensor data
Uses lightweight 2D convolution model
Real-time processing on embedded CPU
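The reprojection-based mapping step mentioned in the summary can be sketched as standard pinhole back-projection of the inpainted depth map into 3D points in the camera frame. This is an illustrative sketch only; the intrinsics `fx, fy, cx, cy` are placeholders, and the paper's mapping backend is not specified here.

```python
import numpy as np

def reproject(depth, fx, fy, cx, cy):
    """Back-project a depth image into 3D points (camera frame) with the
    pinhole camera model. Filled-in transparent surfaces in `depth` become
    ordinary obstacle points in the resulting map.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # (H, W, 3) point image

# Toy example with unit intrinsics and a flat 1 m depth plane.
pts = reproject(np.ones((2, 2)), fx=1.0, fy=1.0, cx=0.0, cy=0.0)
```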