Drift-Corrected Monocular VIO and Perception-Aware Planning for Autonomous Drone Racing

📅 2025-12-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address trajectory drift in high-speed autonomous racing drones equipped with only a monocular camera and a low-grade IMU, as mandated in the A2RL/DCL competition, this paper proposes a closed-loop framework coupling perception, localization, and planning. First, YOLOv8 detects race gates in real time; these visual measurements drive global drift correction of a visual-inertial odometry (VIO) system via an extended Kalman filter (EKF). Second, a perception-aware nonlinear model predictive control (MPC) trajectory planner dynamically balances gate visibility against motion performance during aggressive maneuvers. The entire framework runs on an embedded real-time platform. Competition results demonstrate significantly improved localization robustness over prolonged high-speed flights: third place in the AI Grand Challenge (peak speed 43.2 km/h) and second place in both the AI Drag Race and the AI Multi-Drone Race (speeds exceeding 59 km/h), validating high-accuracy, high-reliability autonomous racing under minimal sensing.
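The drift-correction idea in the summary can be sketched with a minimal filter: VIO motion increments drive the prediction step (so uncertainty grows during flight), and each gate detection supplies an absolute position fix that pulls the estimate back and shrinks the covariance. This is an illustrative linear Kalman filter over position only, with made-up noise values, not the paper's full EKF over the VIO state.

```python
import numpy as np

class DriftCorrector:
    """Illustrative sketch: fuse drifting VIO odometry with occasional
    global position fixes derived from gate detections. The state is
    the drone's 3-D position; measurement model is identity (H = I)."""

    def __init__(self, p0, sigma0=1.0):
        self.p = np.asarray(p0, dtype=float)  # position estimate (x, y, z)
        self.P = np.eye(3) * sigma0**2        # estimate covariance

    def predict(self, vio_delta, q=0.05):
        # VIO provides a relative motion increment; integrating it
        # accumulates drift, so process noise inflates the covariance.
        self.p = self.p + np.asarray(vio_delta, dtype=float)
        self.P = self.P + np.eye(3) * q**2

    def update(self, gate_pos, r=0.2):
        # A detected gate with a known map location yields an absolute
        # position measurement that corrects the accumulated drift.
        R = np.eye(3) * r**2
        K = self.P @ np.linalg.inv(self.P + R)  # Kalman gain (H = I)
        z = np.asarray(gate_pos, dtype=float)
        self.p = self.p + K @ (z - self.p)
        self.P = (np.eye(3) - K) @ self.P
```

After ten VIO-only prediction steps the covariance has grown; a single gate fix with small measurement noise snaps the estimate close to the measured position and collapses the covariance, which is the mechanism that keeps drift bounded over a long race.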

📝 Abstract
The Abu Dhabi Autonomous Racing League (A2RL) x Drone Champions League (DCL) competition requires teams to perform high-speed autonomous drone racing using only a single camera and a low-quality inertial measurement unit, a minimal sensor set that mirrors that of expert human drone racing pilots. This sensor limitation makes the system susceptible to drift in Visual-Inertial Odometry (VIO), particularly during long, fast flights with aggressive maneuvers. This paper presents the system developed for the championship, which achieved competitive performance. Our approach corrected VIO drift by fusing its output with global position measurements derived from a YOLO-based gate detector using a Kalman filter. A perception-aware planner generated trajectories that balance speed with the need to keep gates visible to the perception system. The system demonstrated high performance, securing podium finishes across multiple categories: third place in the AI Grand Challenge with a top speed of 43.2 km/h, second place in the AI Drag Race at over 59 km/h, and second place in the AI Multi-Drone Race. We detail the complete architecture and present a performance analysis based on experimental data from the competition, contributing our insights on building a successful system for monocular vision-based autonomous drone flight.
Problem

Research questions and friction points this paper is trying to address.

How to correct accumulating VIO drift with only a monocular camera and a low-grade IMU
How to balance flight speed against keeping gates visible to the perception system
How to achieve reliable high-speed autonomous drone racing under a minimal sensor set
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fusing VIO with YOLO-based gate detection via Kalman filter
Using perception-aware planner to balance speed and gate visibility
Correcting drift in monocular VIO for high-speed autonomous drone racing
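The perception-aware planning idea above can be illustrated with a toy stage cost: reward velocity toward the next gate while penalizing attitudes that push the gate outside the camera's field of view. The weights, field-of-view angle, and yaw-aligned camera assumption are illustrative choices, not the paper's tuned NMPC formulation.

```python
import numpy as np

def perception_aware_cost(pos, yaw, vel, gate_pos,
                          w_prog=1.0, w_vis=2.0, half_fov=np.radians(45)):
    """Toy perception-aware stage cost (sketch, not the paper's NMPC):
    trades progress toward the gate against gate visibility."""
    to_gate = np.asarray(gate_pos, float) - np.asarray(pos, float)
    dist = np.linalg.norm(to_gate)
    bearing = np.arctan2(to_gate[1], to_gate[0])
    # Angular offset between the camera optical axis (assumed aligned
    # with yaw) and the direction to the gate, wrapped to [-pi, pi].
    err = (bearing - yaw + np.pi) % (2 * np.pi) - np.pi
    # Progress term: velocity component toward the gate.
    progress = np.dot(np.asarray(vel, float), to_gate / max(dist, 1e-6))
    # Visibility term: quadratic penalty once the gate leaves the FOV.
    vis_penalty = max(0.0, abs(err) - half_fov) ** 2
    return -w_prog * progress + w_vis * vis_penalty
```

Minimizing this over a horizon of candidate states yields the trade-off the paper describes: flying fast is cheap only as long as the upcoming gate stays inside the camera's field of view, so the planner sacrifices some aggressiveness to keep the detector supplied with measurements.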
Maulana Bisyir Azhari
Korea Advanced Institute of Science and Technology
Robotics · Computer Vision · SLAM
Donghun Han
Department of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, 34141, South Korea
Je In You
Department of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, 34141, South Korea
Sungjun Park
Department of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, 34141, South Korea
David Hyunchul Shim
Professor, School of Electrical Engineering; Director, Korea RPAS Research Center, KAIST
Unmanned Systems · Robotic Systems