PanoAir: A Panoramic Visual-Inertial SLAM with Cross-Time Real-World UAV Dataset

📅 2026-04-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of existing visual-inertial SLAM methods, which suffer from drift or failure in complex UAV scenarios due to their reliance on limited field-of-view cameras. To overcome this, we present the first panoramic visual-inertial dataset tailored for UAV applications and introduce a novel SLAM framework specifically designed for omnidirectional imagery. By incorporating full-sphere feature extraction and a dedicated loop-closure detection mechanism, our approach significantly enhances pose estimation accuracy, robustness, and global consistency. Extensive experiments demonstrate that the proposed method outperforms state-of-the-art alternatives on both our newly released dataset and existing public benchmarks. Furthermore, it achieves computational efficiency on embedded platforms comparable to that of desktop systems, enabling practical deployment in resource-constrained UAV environments.
📝 Abstract
Accurate pose estimation is fundamental for unmanned aerial vehicle (UAV) applications, where Visual-Inertial SLAM (VI-SLAM) provides a cost-effective solution for localization and mapping. However, existing VI-SLAM methods mainly rely on sensors with limited fields of view (FoV), which can lead to drift and even failure in complex UAV scenarios. Although panoramic cameras provide omnidirectional perception to improve robustness, panoramic VI-SLAM and corresponding real-world datasets for UAVs remain underexplored. To address this limitation, we first construct a real-world panoramic visual-inertial dataset covering diverse flight conditions, including varying illumination, altitudes, trajectory lengths, and motion dynamics. To achieve accurate and robust pose estimation under such challenging UAV scenarios, we propose a panoramic VI-SLAM framework that exploits the omnidirectional FoV via the proposed panoramic feature extraction and panoramic loop closure, enhancing feature constraints and ensuring global consistency. Extensive experiments on both the proposed dataset and public benchmarks demonstrate that our method achieves superior accuracy, robustness, and consistency compared to existing approaches. Moreover, deployment on an embedded platform validates its practical applicability, achieving computational efficiency comparable to PC implementations. The source code and dataset are publicly available at https://drive.google.com/file/d/1lG1Upn6yi-N6tYpEHAt6dfR1uhzNtWbT/view
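The panoramic feature extraction described in the abstract operates on the full sphere rather than a pinhole image plane. The paper's exact camera model is not given here, but a common building block for such pipelines is mapping equirectangular pixels to unit bearing vectors on the sphere; the sketch below illustrates this under an assumed coordinate convention (the function name and the longitude/latitude mapping are assumptions, not from the paper):

```python
import numpy as np

def pixel_to_bearing(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit bearing vector.

    Assumed convention (not from the paper): u spans longitude
    [-pi, pi) left to right; v spans latitude [pi/2, -pi/2] top to bottom.
    """
    lon = (u / width) * 2.0 * np.pi - np.pi       # longitude in [-pi, pi)
    lat = np.pi / 2.0 - (v / height) * np.pi      # latitude in [pi/2, -pi/2]
    # Spherical-to-Cartesian: y points toward the image center (lon = lat = 0).
    x = np.cos(lat) * np.sin(lon)
    y = np.cos(lat) * np.cos(lon)
    z = np.sin(lat)
    return np.array([x, y, z])

# Example: the image-center pixel maps to the forward axis.
center = pixel_to_bearing(2048, 1024, 4096, 2048)
```

Representing features as bearing vectors like this lets the same geometric constraints (epipolar, reprojection) apply uniformly across the full 360° FoV, which is why panoramic SLAM front-ends typically work on the sphere instead of on a distorted image plane.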
Problem

Research questions and friction points this paper is trying to address.

Visual-Inertial SLAM, panoramic camera, UAV, limited field of view, real-world dataset
Innovation

Methods, ideas, or system contributions that make the work stand out.

panoramic VI-SLAM, omnidirectional perception, UAV dataset, loop closure, embedded deployment
Yiyang Wu
School of Aeronautics and Astronautics, Sun Yat-sen University, Guangzhou 510275, China
Xiaohu Zhang
The University of Hong Kong
Urban Technology, Transport Geography
Yanjin Du
School of Aeronautics and Astronautics, Sun Yat-sen University, Guangzhou 510275, China
Tongsu Zhang
School of Aeronautics and Astronautics, Sun Yat-sen University, Guangzhou 510275, China
Chujun Li
School of Aeronautics and Astronautics, Sun Yat-sen University
6D pose estimation, 6D pose tracking, Anti-UAVs, Multimodal
Siyang Chen
School of Aeronautics and Astronautics, Sun Yat-sen University
Space debris detection
Guoyi Zhang
School of Aeronautics and Astronautics, Sun Yat-sen University, Guangzhou 510275, China
Xiangpeng Xu
Sun Yat-sen University
UAV vision perception, pose estimation