🤖 AI Summary
Conventional radar- or vision-based edge detection for lightweight UAVs incurs high hardware costs and excessive computational overhead. Method: This paper repurposes the ground effect, traditionally treated as a flight disturbance, into a novel sensing modality, enabling environmental boundary detection using only inertial measurement unit (IMU) data and flight control commands, with no additional sensors. A theoretical model characterizes the coupling among ground effect, altitude, attitude, and control inputs; building on this model, a lightweight feature-extraction and boundary-discrimination algorithm is designed. Results: Experiments show a mean boundary detection distance error of only 0.051 m, 86% lower than baseline methods, and power consumption as low as 43 mW, substantially outperforming vision-based approaches. The method is sensor-efficient and computationally frugal, making it particularly valuable in resource-constrained applications such as post-disaster search and rescue and autonomous navigation.
📝 Abstract
Rapid and accurate drone-based environmental edge detection is highly valuable for tasks such as disaster relief and autonomous navigation. Current methods rely on radars or cameras, which raise deployment costs and burden lightweight drones with heavy computational demands. In this paper, we propose AirTouch, a system that transforms the ground effect from a stability "foe" in traditional flight control into a "friend" for accurate and efficient edge detection. Our key insight is that analyzing a drone's basic attitude sensor readings and flight commands allows us to detect changes in the ground effect. Such changes typically indicate that the drone is flying over a boundary between two materials, making this information valuable for edge detection. We develop this insight through theoretical analysis, algorithm design, and implementation, fully leveraging the ground effect as a new sensing modality without compromising flight stability, thereby achieving accurate and efficient scene edge detection. We also compare this new sensing modality with vision-based methods to clarify its distinct advantages in resource efficiency and detection capability. Extensive evaluations demonstrate that our system achieves high detection accuracy, with a mean detection distance error of 0.051 m, outperforming the baseline by 86%. At this detection performance, our system consumes only 43 mW, establishing this new sensing modality as a low-cost, highly efficient approach to edge detection.
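The core idea, that a ground-effect change at a material boundary perturbs the relationship between flight commands and IMU readings, can be illustrated with a minimal sketch. This is not the paper's algorithm; the signal names, the residual definition, and the threshold rule are illustrative assumptions:

```python
import math
from statistics import mean, stdev

def detect_boundary(thrust_cmd, accel_z, window=20, k=5.0):
    """Flag samples where the command-to-measurement residual shifts abruptly,
    a hypothetical proxy for a ground-effect change at a material boundary."""
    # Residual between commanded thrust and measured vertical acceleration;
    # a change in ground effect alters this coupling, so a boundary crossing
    # appears as a step in the residual.
    residual = [a - t for a, t in zip(accel_z, thrust_cmd)]
    flags = [False] * len(residual)
    for i in range(window, len(residual)):
        ref = residual[i - window:i]
        mu, sigma = mean(ref), stdev(ref) + 1e-9
        # Flag a sample that deviates strongly from the recent baseline.
        flags[i] = abs(residual[i] - mu) > k * sigma
    return flags

# Deterministic toy data: constant command, mildly oscillating measurement,
# with a step at sample 60 mimicking a ground-effect shift at a boundary.
thrust = [1.0] * 120
accel = [1.0 + 0.01 * math.sin(0.7 * i) for i in range(120)]
for i in range(60, 120):
    accel[i] += 0.2
flags = detect_boundary(thrust, accel)
print(flags.index(True))  # → 60, the step sample
```

In the actual system, the model couples ground effect with altitude, attitude, and control inputs; this toy detector only conveys why no extra sensors are needed, since both inputs are already available on any flight controller.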