🤖 AI Summary
To address trajectory drift in high-speed autonomous racing drones equipped with a monocular camera and a low-grade IMU—common in A2RL/DCL competitions—this paper proposes a closed-loop framework coupling perception, localization, and planning. First, YOLOv8 detects racing gates in real time; these visual measurements drive global drift correction of a visual-inertial odometry (VIO) system via an extended Kalman filter (EKF). Second, a perception-aware nonlinear model predictive control (MPC) trajectory planner dynamically balances gate visibility against motion performance during aggressive maneuvers. The entire framework is deployed on an embedded real-time platform. Experiments demonstrate significantly improved localization robustness over prolonged high-speed flights: third place in the AI Grand Challenge (peak speed 43.2 km/h), and second place in both the AI Drag Race and the AI Multi-Drone Race (speeds exceeding 59 km/h), validating the feasibility of high-accuracy, high-reliability autonomous racing under minimal sensing.
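The core fusion idea—VIO provides smooth relative motion but drifts, while gate detections provide drift-free global position fixes—can be sketched with a minimal scalar Kalman filter. This is a hypothetical illustration, not the authors' implementation (their system is a full EKF over the 3D VIO state); all class and parameter names here are invented for the sketch.

```python
class DriftCorrectingKF:
    """Toy 1D Kalman filter: integrate drifting VIO increments,
    correct with occasional global fixes from gate detections.
    Noise values q and r are illustrative assumptions."""

    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.25):
        self.x = x0   # estimated global position
        self.p = p0   # estimate variance
        self.q = q    # process noise added per VIO step
        self.r = r    # gate-measurement noise variance

    def predict(self, vio_delta):
        # Propagate with a VIO position increment; drift accumulates here.
        self.x += vio_delta
        self.p += self.q

    def update(self, gate_position):
        # Fuse a global position derived from a gate detection.
        k = self.p / (self.p + self.r)            # Kalman gain
        self.x += k * (gate_position - self.x)    # innovation-weighted correction
        self.p *= (1.0 - k)
```

Even with a 20% bias on every VIO increment, periodic gate fixes keep the fused estimate bounded near the true position, whereas raw VIO integration drifts without limit—the same mechanism, in higher dimensions, that the paper uses to sustain long high-speed flights.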
📝 Abstract
The Abu Dhabi Autonomous Racing League (A2RL) x Drone Champions League (DCL) competition requires teams to perform high-speed autonomous drone racing using only a single camera and a low-quality inertial measurement unit -- a minimal sensor set that mirrors that of expert human drone racing pilots. This sensor limitation makes the system susceptible to drift in Visual-Inertial Odometry (VIO), particularly during long, fast flights with aggressive maneuvers. This paper presents the system developed for the championship, which achieved competitive performance. Our approach corrected VIO drift by fusing its output with global position measurements derived from a YOLO-based gate detector using a Kalman filter. A perception-aware planner generated trajectories that balance speed against the need to keep gates visible for the perception system. The system demonstrated high performance, securing podium finishes across multiple categories: third place in the AI Grand Challenge with a top speed of 43.2 km/h, second place in the AI Drag Race at speeds over 59 km/h, and second place in the AI Multi-Drone Race. We detail the complete architecture and present a performance analysis based on experimental data from the competition, contributing our insights on building a successful system for monocular vision-based autonomous drone flight.