🤖 AI Summary
This work addresses the limitations of existing indoor drone inspection platforms, which often lack compactness, active perception capabilities, and reproducibility between simulation and real-world deployment. The authors propose an open-source, co-designed hardware-software platform that integrates a hardware-synchronized active perception system, onboard GPU-accelerated computation, a containerized ROS 2 autonomy stack, and a high-fidelity, photorealistic digital twin. This integration enables, for the first time, zero-shot transfer of autonomy components between simulation and physical deployment for compact indoor inspection drones. The platform demonstrates onboard perception-driven trajectory tracking and autonomous exploration in industrial environments. All hardware and software designs are openly released, establishing a comprehensive and reproducible ecosystem for future research in autonomous aerial inspection.
📝 Abstract
Autonomous indoor flight for critical asset inspection presents fundamental challenges in perception, planning, control, and learning. Despite rapid progress, the field still lacks a compact, active-sensing, open-source platform that is reproducible across simulation and real-world operation. To address this gap, we present Agipix, a co-designed open hardware and software platform for indoor aerial autonomy and critical asset inspection. Agipix features a compact, hardware-synchronized active-sensing platform with onboard GPU-accelerated compute that is capable of agile flight; a containerized, ROS 2-based modular autonomy stack; and a photorealistic digital twin of the hardware platform together with a reliable UI. These elements enable rapid iteration via zero-shot transfer of containerized autonomy components between simulation and real flights. We demonstrate trajectory tracking and exploration performance using onboard sensing in industrial indoor environments. All hardware designs, simulation assets, and containerized software are released openly together with documentation.