Vision-only UAV State Estimation for Fast Flights Without External Localization Systems: A2RL Drone Racing Finalist Approach

📅 2026-02-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of high-precision state estimation for high-speed drones operating in GNSS-denied environments with dense obstacles. The authors propose a monocular visual-inertial odometry (VIO) approach that fuses a single RGB camera and an IMU, incorporating an onboard landmark measurement scheme and a novel full-state mathematical drift model. This method enables, for the first time, online joint drift compensation for position, orientation, linear velocity, and angular velocity using only lightweight sensors. Extensive validation—including 1,600 simulations and numerous real-world flight tests—demonstrates the system’s robustness and accuracy under highly dynamic conditions. The approach secured a top-four finish among 210 teams in the A2RL 2025 Drone Racing Challenge, underscoring its practical effectiveness and reliability.

📝 Abstract
Fast flights with aggressive maneuvers in cluttered GNSS-denied environments require fast, reliable, and accurate UAV state estimation. In this paper, we present an approach for onboard state estimation of a high-speed UAV using a monocular RGB camera and an IMU. Our approach fuses data from Visual-Inertial Odometry (VIO), an onboard landmark-based camera measurement system, and an IMU to produce an accurate state estimate. Using onboard measurement data, we estimate and compensate for VIO drift through a novel mathematical drift model. State-of-the-art approaches often rely on more complex hardware (e.g., stereo cameras or rangefinders) and use uncorrected drifting VIO velocities, orientation, and angular rates, leading to errors during fast maneuvers. In contrast, our method corrects all VIO states (position, orientation, linear and angular velocity), resulting in accurate state estimation even during rapid and dynamic motion. Our approach was thoroughly validated through 1600 simulations and numerous real-world experiments. Furthermore, we applied the proposed method in the A2RL Drone Racing Challenge 2025, where our team advanced to the final four out of 210 teams and earned a medal.
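The core idea in the abstract — treating VIO drift as a slowly varying error estimated from occasional landmark fixes and subtracted from the raw VIO output — can be illustrated with a toy sketch. This is only a minimal position-only illustration under assumptions of my own (the function name `estimate_drift`, the exponential-smoothing gain `alpha`, and landmark fixes available at every step are all hypothetical); the paper's actual method uses a full-state mathematical drift model covering orientation, linear velocity, and angular velocity as well.

```python
import numpy as np

def estimate_drift(vio_positions, landmark_positions, alpha=0.2):
    """Toy VIO drift compensation (position only).

    vio_positions:      (N, 3) raw, drifting VIO position estimates
    landmark_positions: (N, 3) positions implied by landmark measurements
    alpha:              smoothing gain for the drift estimate (illustrative)

    Returns drift-corrected positions and the final drift estimate.
    """
    drift = np.zeros(3)
    corrected = []
    for p_vio, p_lm in zip(vio_positions, landmark_positions):
        # Innovation: apparent offset between VIO and the landmark fix.
        innovation = p_vio - p_lm
        # Exponentially-weighted update of the slowly varying drift.
        drift = (1.0 - alpha) * drift + alpha * innovation
        corrected.append(p_vio - drift)
    return np.array(corrected), drift
```

In a real system the landmark fixes would arrive intermittently and the drift model would be propagated between them; here the per-step update merely shows why even a simple drift estimate shrinks the error of the raw VIO track.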
Problem

Research questions and friction points this paper is trying to address.

UAV state estimation
GNSS-denied environments
fast flight
monocular vision
Visual-Inertial Odometry
Innovation

Methods, ideas, or system contributions that make the work stand out.

Visual-Inertial Odometry
drift compensation
monocular vision
UAV state estimation
onboard localization
Filip Novák
Czech Technical University in Prague
Matvej Petrlík
Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Czech Republic
Matej Novosad
Ph.D. Student, Multi-Robot Systems Group, Czech Technical University in Prague
aerial robotics, mobile robotics, motion planning, multi-goal planning
Parakh M. Gupta
PhD Student, MRS, CTU-Prague
aerial robotics, modular robotics, robot control, marine robotics, modular systems
Robert Pěnička
Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Czech Republic
M. Saska
Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Czech Republic