Flight Dynamics to Sensing Modalities: Exploiting Drone Ground Effect for Accurate Edge Detection

📅 2025-09-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional radar- or vision-based edge detection for lightweight UAVs incurs high hardware costs and excessive computational overhead. Method: This paper repurposes the ground effect, traditionally treated as a flight disturbance, as a novel sensing modality, enabling environmental boundary detection using only inertial measurement unit (IMU) data and flight control commands, with no additional sensors. A theoretical model characterizes the coupling among ground effect, altitude, attitude, and control inputs; on this basis, a lightweight feature-extraction and boundary-discrimination algorithm is designed. Results: Experiments demonstrate a mean boundary detection distance error of only 0.051 m, 86% lower than baseline methods, with power consumption as low as 43 mW, substantially outperforming vision-based approaches. The method is sensor-efficient and computationally frugal, making it particularly valuable in resource-constrained applications such as post-disaster search and rescue and autonomous navigation.
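The coupling between ground effect and thrust that the summary describes can be illustrated with the classical Cheeseman-Bennett in-ground-effect model. This is a minimal sketch using that well-known rotorcraft approximation, not necessarily the model derived in the paper:

```python
def ground_effect_thrust_ratio(z: float, rotor_radius: float) -> float:
    """Cheeseman-Bennett hover approximation:
    T_IGE / T_OGE = 1 / (1 - (R / (4 z))^2), valid for z > R / 4,
    where z is height above ground and R is the rotor radius.
    """
    if z <= rotor_radius / 4:
        raise ValueError("model is only valid for z > R / 4")
    return 1.0 / (1.0 - (rotor_radius / (4.0 * z)) ** 2)

# Near the ground the rotors gain extra thrust; far from it the ratio
# approaches 1. A change in the surface below the drone shifts the
# effective ground effect, which at a fixed commanded thrust appears as
# an altitude/attitude perturbation in the IMU stream.
near = ground_effect_thrust_ratio(z=0.1, rotor_radius=0.1)  # strong ground effect
far = ground_effect_thrust_ratio(z=1.0, rotor_radius=0.1)   # nearly out of ground effect
```

For a 0.1 m rotor hovering at 0.1 m, the model predicts roughly a 7% thrust gain, which decays rapidly with altitude; this steep gradient is what makes ground-effect changes detectable from flight data alone.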

📝 Abstract
Drone-based rapid and accurate environmental edge detection is highly advantageous for tasks such as disaster relief and autonomous navigation. Current methods, using radars or cameras, raise deployment costs and burden lightweight drones with high computational demands. In this paper, we propose AirTouch, a system that transforms the ground effect from a stability "foe" in the traditional flight-control view into a "friend" for accurate and efficient edge detection. Our key insight is that analyzing a drone's basic attitude sensor readings and flight commands allows us to detect ground effect changes. Such changes typically indicate the drone flying over a boundary between two materials, making this information valuable for edge detection. We develop this insight through theoretical analysis, algorithm design, and implementation, fully leveraging the ground effect as a new sensing modality without compromising drone flight stability, thereby achieving accurate and efficient scene edge detection. We also compare this new sensing modality with vision-based methods to clarify its exclusive advantages in resource efficiency and detection capability. Extensive evaluations demonstrate that our system achieves high detection accuracy with a mean detection distance error of 0.051 m, outperforming the baseline method by 86%. While delivering this detection performance, our system requires only 43 mW of power, establishing this new sensing modality for low-cost and highly efficient edge detection.
Problem

Research questions and friction points this paper is trying to address.

Transforming drone ground effect from stability foe to detection friend
Detecting environmental edges using drone attitude sensors and flight commands
Achieving accurate edge detection without expensive sensors or high computation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Exploits drone ground effect as sensing modality
Analyzes attitude sensor readings for edge detection
Achieves high accuracy with minimal power consumption
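The boundary-discrimination idea above can be sketched as change detection on a control-residual signal derived from IMU readings and flight commands. The sliding-window z-score detector below is a hypothetical stand-in for the paper's lightweight algorithm, shown only to make the sensing principle concrete:

```python
from collections import deque
import math

def detect_boundary(residuals, window=20, z_thresh=3.0):
    """Flag sample indices where the control residual deviates sharply
    from its recent baseline (sliding-window z-score). Hypothetical
    illustration of boundary discrimination, not the paper's algorithm."""
    buf = deque(maxlen=window)  # recent baseline samples
    hits = []
    for i, r in enumerate(residuals):
        if len(buf) == window:
            mean = sum(buf) / window
            var = sum((x - mean) ** 2 for x in buf) / window
            std = math.sqrt(var) or 1e-9  # guard against a flat baseline
            if abs(r - mean) / std > z_thresh:
                hits.append(i)
        buf.append(r)
    return hits

# Synthetic residual trace: flat over one surface, then a step where the
# drone crosses an edge into a region with different ground effect.
trace = [0.0] * 40 + [1.0] * 40
hits = detect_boundary(trace)  # first flagged index is 40, the edge crossing
```

Pairing each flagged sample with the drone's position estimate at that timestamp would then yield the boundary location, which is the quantity the paper evaluates with its 0.051 m mean distance error.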
Authors
Chenyu Zhao
Imperial College London, Tsinghua University
Mobile Robotics · AIoT · Sensing Modality · Embedded AI · Robotics and Quadrotors
Jingao Xu
Incoming Assistant Professor, The University of Hong Kong; CMU
Mobile Computing · Internet of Things · Edge Computing · Embodied Navigation
Ciyu Ruan
Shenzhen International Graduate School, Tsinghua University, China
Haoyang Wang
Shenzhen International Graduate School, Tsinghua University, China
Shengbo Wang
Shenzhen International Graduate School, Tsinghua University, China
Jiaqi Li
Shenzhen International Graduate School, Tsinghua University, China
Jirong Zha
Shenzhen International Graduate School, Tsinghua University, China
Weijie Hong
Shenzhen Smart City Communication Co., Ltd., China
Zheng Yang
School of Software, Tsinghua University, Beijing 100084, China
Yunhao Liu
ACM Fellow, IEEE Fellow, CCF Fellow, Tsinghua University
Wireless Sensor Networks/RFID · Cyber Physical Systems and IoT · Privacy and Security · Cloud Computing
Xiao-Ping Zhang
Shenzhen International Graduate School, Tsinghua University, China
Xinlei Chen
Shenzhen International Graduate School, Tsinghua University, China