Flying Co-Stereo: Enabling Long-Range Aerial Dense Mapping via Collaborative Stereo Vision of Dynamic-Baseline

📅 2025-05-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited long-range dense mapping capability of UAV swarms in large-scale unknown environments, this paper proposes a lightweight long-range dense mapping method based on collaborative stereo vision with dynamic inter-UAV baselines. The authors design a cross-UAV dual-spectrum (visible-infrared) baseline estimation algorithm to robustly estimate the dynamic baseline; develop a hybrid feature association strategy that combines deep-learning-based inter-UAV feature matching with onboard optical-flow tracking; and introduce a sparse-to-dense depth recovery framework that improves metric accuracy via exponential depth fitting of long-baseline triangulated point clouds. Experiments demonstrate dense 3D mapping up to 70 m with relative depth errors of 2.3%–9.7%. Compared to conventional fixed short-baseline stereo systems, the proposed method improves maximum depth range by 350% and coverage area by 450%.
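The sparse-to-dense step can be illustrated with a minimal sketch: fit an exponential model mapping the monocular network's relative depth to metric depth, anchored by the sparse long-baseline triangulated points, then apply the model densely. All values and the exact model form below are illustrative assumptions, not the paper's implementation; here the anchors are synthesized from an exact exponential law (a=5, b=3) so the fit is easy to verify, whereas real triangulated landmarks would be noisy.

```python
import numpy as np

# Hypothetical sparse anchors: relative (scale-free) monocular depths at a
# few pixels, paired with metric depths triangulated from the wide
# inter-UAV baseline. Generated from z = 5 * exp(3 * d_rel) for checking.
d_rel = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])
z_metric = 5.0 * np.exp(3.0 * d_rel)

# Fit z = a * exp(b * d_rel) by linear least squares on log(z).
b, log_a = np.polyfit(d_rel, np.log(z_metric), 1)
a = np.exp(log_a)

def densify(d_rel_map):
    """Convert a dense relative-depth map to metric depth via the fitted model."""
    return a * np.exp(b * d_rel_map)
```

With noisy real anchors, the same log-linear fit acts as a robust scale-and-shape calibration of the monocular prediction.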

📝 Abstract
Lightweight long-range mapping is critical for safe navigation of UAV swarms in large-scale unknown environments. Traditional stereo vision systems with fixed short baselines face limited perception ranges. To address this, we propose Flying Co-Stereo, a cross-agent collaborative stereo vision system that leverages the wide-baseline spatial configuration of two UAVs for long-range dense mapping. Key innovations include: (1) a dual-spectrum visual-inertial-ranging estimator for robust baseline estimation; (2) a hybrid feature association strategy combining deep learning-based cross-agent matching and optical-flow-based intra-agent tracking; (3) a sparse-to-dense depth recovery scheme, refining dense monocular depth predictions using exponential fitting of long-range triangulated sparse landmarks for precise metric-scale mapping. Experiments demonstrate the Flying Co-Stereo system achieves dense 3D mapping up to 70 meters with 2.3%–9.7% relative error, outperforming conventional systems by up to 350% in depth range and 450% in coverage area. Project webpage: https://xingxingzuo.github.io/flying_co_stereo
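The benefit of the wide inter-UAV baseline follows from the standard stereo triangulation relation: for a rectified pair, depth is Z = f·B/d (disparity d in pixels), so a disparity uncertainty of σ_d pixels yields a relative depth error of roughly Z·σ_d/(f·B), which shrinks linearly with baseline B. The sketch below uses illustrative numbers (focal length, baselines, error budget) that are assumptions, not values from the paper:

```python
# Stereo range model: Z = f * B / d, so relative depth error ≈ Z * sigma_d / (f * B).
f = 600.0      # focal length [px] (illustrative)
sigma_d = 0.5  # disparity uncertainty [px] (illustrative)

def max_range(B, rel_err=0.10):
    """Farthest depth [m] at which relative error Z*sigma_d/(f*B) stays under rel_err."""
    return rel_err * f * B / sigma_d

short = max_range(0.2)  # fixed ~20 cm onboard baseline
wide = max_range(5.0)   # ~5 m collaborative inter-UAV baseline
```

Under this model the usable range scales in direct proportion to the baseline, which is why a dynamic multi-meter inter-UAV baseline reaches far beyond a fixed onboard stereo rig.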
Problem

Research questions and friction points this paper is trying to address.

Enabling long-range dense mapping for UAV swarms
Overcoming fixed short baseline limitations in stereo vision
Achieving precise metric-scale mapping via collaborative stereo
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dual-spectrum estimator for baseline estimation
Hybrid feature association with deep learning
Sparse-to-dense depth recovery via exponential fitting
Zhaoying Wang
Shanghai Jiao Tong University, Shanghai 200240, China
Xingxing Zuo
Assistant Professor @MBZUAI
Robotics · State Estimation · Embodied AI
Wei Dong
Shanghai Jiao Tong University, Shanghai 200240, China