🤖 AI Summary
Autonomous drone racing in dynamic, unknown environments is limited by reliance on pre-built maps, visual-inertial odometry (VIO), or static-world assumptions. Method: This paper proposes a model-free, closed-loop control method within a vision-based servoing framework that uses only monocular visual observations, specifically the line-of-sight (LOS) angle to the gates. Crucially, it introduces proportional navigation (PN), for the first time, into a map-free, VIO-free racing setting and derives a closed-form optimal control law that explicitly respects state and input constraints. The approach requires no pose estimation, environmental priors, or kinematic modeling; control commands are generated in real time directly from the LOS angle and its rate of change. Results: Simulation and real-world experiments demonstrate stable gate traversal at high speeds (>8 m/s) along diverse motion trajectories. The method is robust to wind disturbances, model uncertainties, actuation delays, and dynamically moving gates, broadening the applicability of autonomous racing to open, unstructured scenarios such as disaster response.
📝 Abstract
Autonomous drone racing requires powerful perception, planning, and control, and has become a benchmark and test field for autonomous, agile flight. Existing work usually assumes static race tracks with known maps, which enables offline planning of time-optimal trajectories, localization against the gates to reduce drift in visual-inertial odometry (VIO) for state estimation, or training of learning-based methods for the particular race track and operating environment. In contrast, many real-world tasks, such as disaster response or delivery, must be performed in unknown and dynamic environments. To close this gap and make drone racing more robust to unseen environments and moving gates, we propose a control algorithm that requires neither a race-track map nor VIO and uses only monocular measurements of the line of sight (LOS) to the gates. For this purpose, we adopt the law of proportional navigation (PN) to fly accurately through the gates despite gate motion or wind. We formulate the PN-informed vision-based control problem for drone racing as a constrained optimization problem and derive a closed-form optimal solution. Extensive simulations and real-world experiments demonstrate that our method navigates through moving gates at high speeds while remaining robust to different gate movements, model errors, wind, and delays.
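To illustrate the underlying idea, classical proportional navigation commands a turn rate proportional to the LOS angular rate, nulling LOS rotation so the vehicle closes on the target even when it moves. The following is a minimal 2D sketch of that textbook PN law, not the paper's constrained closed-form solution; the function name, gains, and intercept threshold are all illustrative assumptions:

```python
import math

def simulate_pn(gate_pos, gate_vel, n_gain=3.0, speed=8.0,
                dt=0.01, t_max=10.0):
    """Textbook 2D proportional navigation (illustrative sketch only).
    Pursuer flies at constant speed; its heading rate is set to
    n_gain * los_rate, where los_rate is the line-of-sight angular rate.
    Returns (final distance to gate, elapsed time)."""
    px, py = 0.0, 0.0
    tx, ty = gate_pos
    tvx, tvy = gate_vel
    heading = math.atan2(ty - py, tx - px)  # start aimed at the gate
    los_prev = heading
    t = 0.0
    while t < t_max:
        dx, dy = tx - px, ty - py
        dist = math.hypot(dx, dy)
        if dist < 0.2:          # close enough to count as "through the gate"
            return dist, t
        los = math.atan2(dy, dx)
        # wrap the LOS difference to [-pi, pi] before differentiating
        d_los = (los - los_prev + math.pi) % (2 * math.pi) - math.pi
        los_rate = d_los / dt
        heading += n_gain * los_rate * dt   # PN steering command
        px += speed * math.cos(heading) * dt
        py += speed * math.sin(heading) * dt
        tx += tvx * dt                      # gate drifts at constant velocity
        ty += tvy * dt
        los_prev = los
        t += dt
    return math.hypot(tx - px, ty - py), t

# e.g. a gate 20 m ahead, offset 5 m, drifting sideways at 1 m/s:
# miss, t = simulate_pn((20.0, 5.0), (0.0, 1.0))
```

Against a constant-velocity gate, nulling the LOS rate drives the pursuer onto a collision course, which is why PN tolerates gate motion without predicting it; the paper builds on this property while additionally enforcing state and input constraints.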