AI Summary
This work addresses the challenge of multi-UAV cooperative navigation in GNSS-denied, complex outdoor environments such as dense forests by proposing a fully communication-free coordination approach. Leveraging an onboard anisotropic 3D LiDAR, the method integrates SLAM, obstacle detection, and neighbor UAV perception into a perception-aware 3D navigation framework, enabling safe and efficient goal-reaching under limited field-of-view constraints. It represents the first real-world demonstration of GNSS- and communication-independent multi-UAV coordination in natural outdoor settings, eliminating reliance on external positioning or communication infrastructure. Extensive simulations and field experiments validate the approach's scalability, robustness, and reliability.
Abstract
We present a communication-free method for safe multi-robot coordination in complex environments such as forests with dense canopy cover, where GNSS is unavailable. Our approach relies on an onboard anisotropic 3D LiDAR sensor used for SLAM as well as for detecting obstacles and neighboring robots. We develop a novel perception-aware 3D navigation framework that enables robots to safely and effectively progress toward a goal region despite limited sensor field-of-view. The approach is evaluated through extensive simulations across diverse scenarios and validated in real-world field experiments, demonstrating its scalability, robustness, and reliability.
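To make the limited field-of-view constraint concrete, the sketch below checks whether a neighboring robot's relative position falls inside an anisotropic LiDAR's sensing volume (360° horizontally, but a narrow vertical fan). This is only an illustration of the geometry the perception-aware planner must reason about; the range and vertical-FOV values are hypothetical placeholders, not the sensor parameters used in the paper.

```python
import math

def in_lidar_fov(rel, max_range=30.0, vert_fov_deg=45.0):
    """Return True if a relative 3D point (x, y, z) lies inside a
    simplified anisotropic LiDAR field of view: full 360-degree
    horizontal coverage but a limited vertical fan.

    rel          -- neighbor position relative to the sensor, meters
    max_range    -- hypothetical maximum detection range, meters
    vert_fov_deg -- hypothetical total vertical field of view, degrees
    """
    x, y, z = rel
    dist = math.sqrt(x * x + y * y + z * z)
    if dist == 0.0 or dist > max_range:
        return False
    # Elevation angle of the point above/below the sensor plane.
    horiz = math.hypot(x, y)
    elev_deg = math.degrees(math.atan2(z, horiz))
    return abs(elev_deg) <= vert_fov_deg / 2.0

# A neighbor 10 m away in the sensor plane is visible; one almost
# directly overhead falls outside the vertical fan.
print(in_lidar_fov((10.0, 0.0, 0.0)))  # True
print(in_lidar_fov((1.0, 0.0, 5.0)))   # False
```

A perception-aware planner would use such a visibility predicate in reverse: choosing motions that keep neighbors and obstacles inside this volume rather than merely testing membership after the fact.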