🤖 AI Summary
This work addresses the significant challenge of enabling real-time object detection on nano-drones constrained to approximately 10 MiB of memory and 1 W of power consumption. The authors propose an adaptive object detection method tailored for such platforms, introducing, for the first time, an early-exit mechanism for dense prediction tasks on nano-drones. Their approach combines a lightweight MobileNet-SSD backbone with an auxiliary null-frame classifier that dynamically identifies frames containing no objects of interest and terminates inference early, thereby reducing average latency. Implemented on the Bitcraze Crazyflie 2.1 platform, the method achieves up to a 24% improvement in throughput while sacrificing only 0.015 mAP in accuracy, demonstrating an effective trade-off between computational efficiency and detection performance.
📝 Abstract
Deploying tiny computer vision Deep Neural Networks (DNNs) on-board nano-sized drones is key for achieving autonomy, but is complicated by the extremely tight constraints of their computational platforms (approximately 10 MiB memory, 1 W power budget). Early-exit adaptive DNNs that dial down the computational effort for "easy-to-process" input frames represent a promising way to reduce the average inference latency. However, while this approach is extensively studied for classification, its application to dense tasks like object detection (OD) is not straightforward. In this paper, we propose BlankSkip, an adaptive network for on-device OD that leverages a simple auxiliary classification task for early exit, i.e., identifying frames with no objects of interest. With experiments on a real-world nano-drone platform, the Bitcraze Crazyflie 2.1, we achieve up to a 24% average throughput improvement with a limited 0.015 mean Average Precision (mAP) drop compared to a static MobileNet-SSD detector, on a state-of-the-art nano-drone OD dataset.
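The early-exit control flow described above can be sketched as follows. This is a minimal illustrative stub, not the paper's implementation: the function names (`backbone_features`, `null_frame_score`, `detection_head`), the mean-intensity "features", and the `blank_threshold` value are all assumptions chosen to show the branching logic, in which the auxiliary null-frame classifier is evaluated on the shared backbone features and, when it is confident the frame is empty, the expensive SSD detection heads are skipped.

```python
def backbone_features(frame):
    # Stand-in for the shared MobileNet-SSD trunk (hypothetical):
    # here just the mean pixel intensity of a flat list of floats.
    return sum(frame) / len(frame)

def null_frame_score(feats):
    # Stand-in auxiliary classifier head (hypothetical): probability
    # that the frame is "blank", i.e. contains no objects of interest.
    return 1.0 if feats < 0.1 else 0.0

def detection_head(feats):
    # Stand-in SSD dense-prediction head (hypothetical): returns one
    # dummy bounding box whenever it is actually invoked.
    return [(0, 0, 10, 10)]

def adaptive_detect(frame, blank_threshold=0.9):
    feats = backbone_features(frame)       # run shared trunk once
    p_blank = null_frame_score(feats)      # cheap auxiliary check
    if p_blank >= blank_threshold:
        return []                          # early exit: skip detection heads
    return detection_head(feats)           # full dense prediction
```

On a nano-drone, frames with no objects of interest would thus cost only the trunk plus the lightweight classifier, which is how the average latency (and hence throughput) improves while mAP on non-empty frames is unaffected.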