FRED: The Florence RGB-Event Drone Dataset

📅 2025-06-05
🤖 AI Summary
To address the limitations of RGB-camera-based perception for small, high-speed UAVs under complex illumination conditions, this paper introduces FRED—the first RGB-event dual-modal benchmark specifically designed for high-speed UAV perception. FRED comprises 7 hours of densely annotated trajectories across five UAV models and challenging scenarios including rain and low-light conditions. It features the first synchronized acquisition and precise spatiotemporal alignment of high-temporal-resolution event streams with RGB video, augmented by multi-view calibration and manually refined 3D trajectory annotations. FRED bridges critical gaps in fine-grained temporal annotation and realistic high-speed flight motion modeling, supporting detection, tracking, and trajectory prediction tasks. Its standardized evaluation protocol enables reproducible comparison of small-object perception methods under dynamic lighting, and FRED has already been used to validate multiple state-of-the-art models.
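The summary highlights the temporal alignment of asynchronous event streams with fixed-rate RGB video. A common way to consume such a dual-modal dataset is to slice the event stream into windows, one per RGB frame. The sketch below illustrates this with `numpy`; the function name and the microsecond-timestamp convention are illustrative assumptions, not FRED's actual API.

```python
import numpy as np

def slice_events_per_frame(event_ts, frame_ts):
    """Group a sorted event-timestamp array into one window per RGB frame.

    event_ts : sorted 1-D array of event timestamps (e.g., microseconds)
    frame_ts : sorted 1-D array of RGB frame timestamps, same clock
    Returns a list of (start, end) index pairs into event_ts, where each
    window spans from the previous frame time up to the current one.
    """
    # Window edges: from the first event up to each frame timestamp.
    edges = np.concatenate(([event_ts[0]], frame_ts))
    # searchsorted on a sorted array finds each edge's insertion index,
    # so consecutive edges bound a half-open [start, end) event window.
    starts = np.searchsorted(event_ts, edges[:-1], side="left")
    ends = np.searchsorted(event_ts, edges[1:], side="left")
    return list(zip(starts, ends))

# Toy data: events every 3 time units, frames every 10.
events = np.arange(0, 100, 3)
frames = np.array([10, 20, 30])
print(slice_events_per_frame(events, frames))  # [(0, 4), (4, 7), (7, 10)]
```

Using `side="left"` for both boundaries keeps the windows disjoint, so an event landing exactly on a frame timestamp is counted once, in the following window.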

📝 Abstract
Small, fast, and lightweight drones present significant challenges for traditional RGB cameras due to their limitations in capturing fast-moving objects, especially under challenging lighting conditions. Event cameras offer an ideal solution, providing high temporal resolution and dynamic range, yet existing benchmarks often lack fine temporal resolution or drone-specific motion patterns, hindering progress in these areas. This paper introduces the Florence RGB-Event Drone dataset (FRED), a novel multimodal dataset specifically designed for drone detection, tracking, and trajectory forecasting, combining RGB video and event streams. FRED features more than 7 hours of densely annotated drone trajectories, using 5 different drone models and including challenging scenarios such as rain and adverse lighting conditions. We provide detailed evaluation protocols and standard metrics for each task, facilitating reproducible benchmarking. We hope FRED will advance research in high-speed drone perception and multimodal spatiotemporal understanding.
Problem

Research questions and friction points this paper is trying to address.

Challenges in capturing fast-moving drones with RGB cameras
Lack of drone-specific motion patterns in existing benchmarks
Need for multimodal datasets for drone detection and tracking
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines RGB video and event streams
Includes diverse drone models and scenarios
Provides detailed evaluation protocols