AirV2X: Unified Air-Ground Vehicle-to-Everything Collaboration

📅 2025-06-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high deployment cost of conventional vehicle-infrastructure cooperative systems and the prevalence of "uncovered hazardous zones" in rural and urban-fringe areas, this paper proposes a UAV-assisted aerial-ground integrated V2X collaboration framework. In this framework, UAVs serve as dynamic, reconfigurable aerial road-side units (a-RSUs), providing overhead perception that mitigates ground-level occlusions and enables adaptive patrol and escort navigation, thereby reducing reliance on fixed infrastructure. The paper introduces AirV2X-Perception, a large-scale UAV-assisted driving dataset comprising 6.73 hours of annotated footage across multiple scenarios and weather conditions, filling a critical gap in the evaluation of aerial-assisted autonomous driving perception. Complementing the dataset, the authors open-source a toolchain supporting multi-view perception, dynamic UAV-vehicle localization, and synchronized data acquisition, facilitating Vehicle-to-Drone (V2D) algorithm development, standardized benchmarking, and the practical deployment of aerial-assisted autonomous driving technologies.

📝 Abstract
While multi-vehicular collaborative driving demonstrates clear advantages over single-vehicle autonomy, traditional infrastructure-based V2X systems remain constrained by substantial deployment costs and the creation of "uncovered danger zones" in rural and suburban areas. We present AirV2X-Perception, a large-scale dataset that leverages Unmanned Aerial Vehicles (UAVs) as a flexible alternative or complement to fixed Road-Side Units (RSUs). Drones offer unique advantages over ground-based perception: complementary bird's-eye-views that reduce occlusions, dynamic positioning capabilities that enable hovering, patrolling, and escorting navigation rules, and significantly lower deployment costs compared to fixed infrastructure. Our dataset comprises 6.73 hours of drone-assisted driving scenarios across urban, suburban, and rural environments with varied weather and lighting conditions. The AirV2X-Perception dataset facilitates the development and standardized evaluation of Vehicle-to-Drone (V2D) algorithms, addressing a critical gap in the rapidly expanding field of aerial-assisted autonomous driving systems. The dataset and development kits are open-sourced at https://github.com/taco-group/AirV2X-Perception.
Problem

Research questions and friction points this paper is trying to address.

Addresses limitations of traditional V2X systems with high costs and coverage gaps
Introduces UAVs as flexible alternatives to fixed roadside infrastructure for V2X
Provides dataset for developing Vehicle-to-Drone algorithms in autonomous driving
Innovation

Methods, ideas, or system contributions that make the work stand out.

UAVs as flexible V2X infrastructure alternative
Bird's-eye-views reduce occlusions dynamically
Open-source drone-assisted driving dataset