🤖 AI Summary
This work tackles a common failure mode of existing time-optimal trajectory planners in high-speed flight: by neglecting the coupling among vehicle dynamics, environmental geometry, and visual state estimation, they produce trajectories that degrade perception quality and fail in closed-loop execution. We propose the first unified framework that explicitly integrates an information-theoretic position uncertainty metric, camera field-of-view constraints, and gate geometry into time-optimal trajectory optimization, jointly optimizing speed and perception reliability under full nonlinear dynamics, actuator limits, and aerodynamic effects. Combining nonlinear trajectory optimization, convex geometric gate modeling, and a model predictive contouring controller that decouples lateral and progress errors, the system achieves 100% closed-loop success (up from 55%) on the Split-S track at 9.8 m/s with an average tracking error of only 0.07 m, establishing a scalable benchmark for perception-aware high-speed autonomous flight.
📝 Abstract
Agile quadrotor flight pushes the limits of control, actuation, and onboard perception. While time-optimal trajectory planning has been extensively studied, existing approaches typically neglect the tight coupling between vehicle dynamics, environmental geometry, and the visual requirements of onboard state estimation. As a result, trajectories that are dynamically feasible may fail in closed-loop execution due to degraded visual quality. This paper introduces a unified time-optimal trajectory optimization framework for vision-based quadrotors that explicitly incorporates perception constraints alongside full nonlinear dynamics, rotor actuation limits, aerodynamic effects, camera field-of-view constraints, and convex geometric gate representations.
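The field-of-view constraint mentioned above can be illustrated with a minimal geometric check: a target (e.g., a gate center) is visible if the angle between the camera's optical axis and the line of sight to the target stays below the half field-of-view. This is a simplified conical-FOV sketch for intuition only; the paper's actual constraint formulation inside the optimizer is not reproduced here, and all names are illustrative.

```python
import numpy as np

def in_fov(cam_pos, cam_axis, target, half_fov_rad):
    """Return True if `target` lies inside a conical field of view.

    The cone is centered on the camera's optical axis `cam_axis`
    with half-angle `half_fov_rad`. Visibility holds when the cosine
    of the angle to the target meets or exceeds cos(half_fov).
    """
    d = target - cam_pos
    cos_angle = (cam_axis @ d) / (np.linalg.norm(cam_axis) * np.linalg.norm(d))
    return bool(cos_angle >= np.cos(half_fov_rad))
```

In a planner, a smooth version of this inequality (e.g., on `cos_angle`) would be imposed along the trajectory so the next gate stays in view.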
The proposed formulation solves for minimum-time lap trajectories on arbitrary racetracks with diverse gate shapes and orientations, while remaining numerically robust and computationally efficient. We derive an information-theoretic position uncertainty metric to quantify visual state-estimation quality and integrate it into the planner through three perception objectives: position uncertainty minimization, sequential field-of-view constraints, and look-ahead alignment. This enables systematic exploration of the trade-off between speed and perceptual reliability.
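One standard way to build an information-theoretic position uncertainty metric is via the Fisher information that bearing measurements to known landmarks (e.g., gate corners) carry about the camera position: each bearing constrains the position perpendicular to the line of sight, with strength falling off with range squared. The sketch below uses the trace of the inverse Fisher information as a scalar uncertainty proxy. This is a generic bearing-only localization model under isotropic angular noise, offered as an assumption-laden illustration, not the paper's exact derivation; `sigma` and the regularization `eps` are illustrative.

```python
import numpy as np

def position_fisher_info(p, landmarks, sigma=0.01):
    """Fisher information of position `p` (3,) from unit-bearing
    observations of `landmarks` (N, 3), with isotropic angular
    noise `sigma` [rad]. Each landmark contributes information in
    the plane perpendicular to its line of sight, scaled by 1/r^2."""
    F = np.zeros((3, 3))
    for l in landmarks:
        d = l - p
        r = np.linalg.norm(d)
        u = d / r
        F += (np.eye(3) - np.outer(u, u)) / (sigma**2 * r**2)
    return F

def uncertainty_metric(p, landmarks, sigma=0.01, eps=1e-9):
    """Scalar position-uncertainty proxy: trace of the (regularized)
    inverse Fisher information, i.e., a Cramer-Rao-style bound on the
    total position estimation variance."""
    F = position_fisher_info(p, landmarks, sigma) + eps * np.eye(3)
    return float(np.trace(np.linalg.inv(F)))
```

Penalizing this quantity along a candidate trajectory steers the planner toward viewpoints that keep informative features close and well spread in the image.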
To accurately track the resulting perception-aware trajectories, we develop a model predictive contouring tracking controller that separates lateral and progress errors. Experiments demonstrate real-world flight speeds up to 9.8 m/s with 0.07 m average tracking error, and closed-loop success rates improved from 55% to 100% on a challenging Split-S course. The proposed system provides a scalable benchmark for studying the fundamental limits of perception-aware, time-optimal autonomous flight.
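The separation of lateral and progress errors described above can be sketched as a simple projection: the tracking error relative to a reference path point is split into a lag component along the path tangent (progress) and a contour component perpendicular to it (lateral). This is a minimal geometric illustration of the decomposition, not the controller's full cost or dynamics model; function and variable names are assumptions.

```python
import numpy as np

def contour_lag_errors(p, p_ref, tangent):
    """Decompose the tracking error of position `p` relative to the
    reference path point `p_ref` with path tangent `tangent`.

    Returns (e_lag, e_contour):
      e_lag     - signed error along the path tangent (progress error)
      e_contour - magnitude of the error perpendicular to the tangent
                  (lateral error)
    """
    t = tangent / np.linalg.norm(tangent)  # unit tangent
    e = p - p_ref
    e_lag = float(t @ e)                   # projection onto tangent
    e_perp = e - e_lag * t                 # residual perpendicular part
    return e_lag, float(np.linalg.norm(e_perp))
```

Weighting these two components separately lets the controller trade progress along the track against corridor-keeping accuracy, which is the core idea behind contouring-style tracking.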