aerial-autonomy-stack -- A Faster-than-Real-Time, Autopilot-Agnostic, ROS 2 Framework to Simulate and Deploy Perception-Based Drones

📅 2026-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes an end-to-end, open-source ROS 2 framework to address key challenges in autonomous drone development, including the sim-to-real deployment gap, complex integration of heterogeneous software and hardware, and prolonged iteration cycles. The framework integrates GPU-accelerated perception modules, a high-fidelity simulation environment, edge computing capabilities, and network communication modeling, while providing unified support for both PX4 and ArduPilot—the two dominant autopilot systems. It achieves, for the first time, full-stack simulation running over 20× faster than real time, significantly accelerating the build–test–deploy cycle for perception-driven drone systems and thereby enhancing both development efficiency and deployment reliability.

📝 Abstract
Unmanned aerial vehicles are rapidly transforming multiple applications, from agricultural and infrastructure monitoring to logistics and defense. Introducing greater autonomy to these systems can make them both more effective and more reliable. Thus, the ability to rapidly engineer and deploy autonomous aerial systems has become of strategic importance. In the 2010s, a combination of high-performance compute, data, and open-source software led to the current deep learning and AI boom, unlocking decades of prior theoretical work. Robotics is on the cusp of a similar transformation. However, physical AI faces unique hurdles, often combined under the umbrella term "simulation-to-reality gap". These span from modeling shortcomings to the complexity of vertically integrating the highly heterogeneous hardware and software systems typically found in field robots. To address the latter, we introduce aerial-autonomy-stack, an open-source, end-to-end framework designed to streamline the pipeline from (GPU-accelerated) perception to (flight controller-based) action. Our stack allows the development of aerial autonomy using ROS 2 and provides a common interface for two of the most popular autopilots: PX4 and ArduPilot. We show that it supports over 20x faster-than-real-time, end-to-end simulation of a complete development and deployment stack -- including edge compute and networking -- significantly compressing the build-test-release cycle of perception-based autonomy.
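The "common interface for two of the most popular autopilots" described in the abstract is, in spirit, an adapter pattern: autonomy logic is written once against an abstract command interface, and per-autopilot adapters translate those commands to PX4 or ArduPilot conventions. The sketch below is purely illustrative; the class and message names (`AutopilotAdapter`, `PX4Adapter`, etc.) are assumptions for exposition, not the actual aerial-autonomy-stack API.

```python
# Hypothetical sketch of an autopilot-agnostic interface, in the spirit of
# the stack's unified PX4/ArduPilot support. All names are illustrative.
from abc import ABC, abstractmethod


class AutopilotAdapter(ABC):
    """Common interface the autonomy layer codes against."""

    @abstractmethod
    def arm(self) -> str: ...

    @abstractmethod
    def set_velocity(self, vx: float, vy: float, vz: float) -> str: ...


class PX4Adapter(AutopilotAdapter):
    # A real adapter would publish px4_msgs over the uXRCE-DDS bridge.
    def arm(self) -> str:
        return "PX4: VEHICLE_CMD_COMPONENT_ARM_DISARM"

    def set_velocity(self, vx: float, vy: float, vz: float) -> str:
        return f"PX4: TrajectorySetpoint velocity=({vx}, {vy}, {vz})"


class ArduPilotAdapter(AutopilotAdapter):
    # A real adapter would speak MAVLink (or AP_DDS topics) instead.
    def arm(self) -> str:
        return "ArduPilot: MAV_CMD_COMPONENT_ARM_DISARM"

    def set_velocity(self, vx: float, vy: float, vz: float) -> str:
        return f"ArduPilot: SET_POSITION_TARGET_LOCAL_NED vel=({vx}, {vy}, {vz})"


def takeoff_sequence(ap: AutopilotAdapter) -> list[str]:
    # Autonomy logic is written once, against the abstract interface,
    # and runs unchanged on either autopilot backend.
    return [ap.arm(), ap.set_velocity(0.0, 0.0, -1.0)]
```

Swapping `PX4Adapter()` for `ArduPilotAdapter()` in `takeoff_sequence` changes only the backend translation, which is what makes the same perception-to-action pipeline testable against both autopilots in simulation.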
Problem

Research questions and friction points this paper is trying to address.

simulation-to-reality gap
autonomous aerial systems
perception-based drones
heterogeneous integration
rapid deployment
Innovation

Methods, ideas, or system contributions that make the work stand out.

faster-than-real-time simulation
autopilot-agnostic
ROS2
perception-based autonomy
simulation-to-reality gap
Jacopo Panerati
University of Toronto
Aerospace, Autonomous Systems, Open Source, Reinforcement Learning, Robotics
Sina Sajjadi
Complexity Science Hub - IT:U Interdisciplinary Transformation University Austria
Networks, Statistical Inference
Sina Soleymanpour
National Research Council Canada
Varunkumar Mehta
National Research Council Canada
Iraj Mantegh
National Research Council Canada