EV-Flying: an Event-based Dataset for In-The-Wild Recognition of Flying Objects

πŸ“… 2025-06-04
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Conventional RGB cameras suffer from motion blur, scale variation, and limited robustness when monitoring fast-moving, small, and unpredictably maneuvering aerial objects (e.g., insects, birds, drones). To address these limitations, this work explores an event-camera-based paradigm for detecting and recognizing flying objects. The authors introduce EV-Flying, an outdoor event-based dataset of flying objects with spatiotemporal bounding boxes and track identities. Asynchronous event streams are encoded directly as spatiotemporal point clouds, avoiding fixed-frame-rate representations, and lightweight PointNet-inspired architectures classify flying objects from these point-based representations. The dataset and methodology aim to enable more efficient and reliable aerial object recognition in real-world scenarios.
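The summary's core idea, encoding an asynchronous event stream as a spatiotemporal point cloud rather than accumulating events into frames, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact pipeline: the `(x, y, t, polarity)` event layout, the sensor resolution, and the random downsampling step are assumptions.

```python
import numpy as np

def events_to_point_cloud(events, sensor_wh=(640, 480), max_points=1024, seed=0):
    """Convert an asynchronous event stream into a normalized 3D point cloud.

    `events` is an (N, 4) array of (x, y, t, polarity) rows. Pixel coordinates
    are scaled to [0, 1] and the time axis is normalized over the observation
    window, so the representation has no fixed frame rate. If more than
    `max_points` events arrive, a random subset is kept (a common PointNet-style
    downsampling step; the paper's actual sampling scheme is an assumption here).
    """
    events = np.asarray(events, dtype=np.float64)
    xs = events[:, 0] / sensor_wh[0]
    ys = events[:, 1] / sensor_wh[1]
    ts = events[:, 2]
    t_span = ts.max() - ts.min()
    ts = (ts - ts.min()) / t_span if t_span > 0 else np.zeros_like(ts)
    cloud = np.stack([xs, ys, ts], axis=1)  # (N, 3) spatiotemporal points
    if len(cloud) > max_points:
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(cloud), size=max_points, replace=False)
        cloud = cloud[idx]
    return cloud
```

Because every point carries its own timestamp, downstream models see the fine temporal structure that frame accumulation would blur away.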

πŸ“ Abstract
Monitoring aerial objects is crucial for security, wildlife conservation, and environmental studies. Traditional RGB-based approaches struggle with challenges such as scale variations, motion blur, and high-speed object movements, especially for small flying entities like insects and drones. In this work, we explore the potential of event-based vision for detecting and recognizing flying objects, in particular animals that may not follow short- and long-term predictable patterns. Event cameras offer high temporal resolution, low latency, and robustness to motion blur, making them well-suited for this task. We introduce EV-Flying, an event-based dataset of flying objects, comprising manually annotated birds, insects, and drones with spatio-temporal bounding boxes and track identities. To effectively process the asynchronous event streams, we employ a point-based approach leveraging lightweight architectures inspired by PointNet. Our study investigates the classification of flying objects using point cloud-based event representations. The proposed dataset and methodology pave the way for more efficient and reliable aerial object recognition in real-world scenarios.
Problem

Research questions and friction points this paper is trying to address.

Detect and recognize flying objects using event-based vision
Address challenges like scale variations and motion blur
Classify flying objects with point cloud event representations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Event-based vision for flying object detection
Point-based approach with lightweight PointNet-inspired architectures
EV-Flying dataset with spatio-temporal annotations
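The PointNet-inspired classification idea listed above can be sketched with a minimal numpy model: a shared per-point MLP, a symmetric max-pool that makes the output independent of event ordering, and a small classifier head. This is an untrained toy sketch of the architecture family, not the paper's model; the layer sizes and the three-way bird/insect/drone head are assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class TinyPointNet:
    """Minimal PointNet-style classifier over (N, 3) event point clouds.

    A shared MLP lifts each (x, y, t) point independently, a symmetric
    max-pool collapses the set into one global feature, and a linear head
    scores the classes (assumed here: bird / insect / drone). Weights are
    random: a sketch of the architecture, not a trained model.
    """
    def __init__(self, feat_dim=64, n_classes=3, seed=42):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0, 0.1, (3, feat_dim))          # shared per-point MLP
        self.b1 = np.zeros(feat_dim)
        self.w2 = rng.normal(0, 0.1, (feat_dim, n_classes))  # classifier head
        self.b2 = np.zeros(n_classes)

    def forward(self, points):
        h = relu(points @ self.w1 + self.b1)  # (N, feat_dim), per-point features
        g = h.max(axis=0)                     # symmetric pooling over the set
        logits = g @ self.w2 + self.b2
        e = np.exp(logits - logits.max())
        return e / e.sum()                    # class probabilities
```

The max-pool is what makes the model a set function: shuffling the events, which arrive asynchronously in no canonical order, leaves the prediction unchanged.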
πŸ”Ž Similar Papers
No similar papers found.
G
Gabriele Magrini
University of Florence
Federico Becattini
Tenure Track Assistant Professor (RTD-B), University of Siena
computer vision, autonomous driving, trajectory prediction, fashion recommendation, cultural heritage
Giovanni Colombo
University of Florence
Pietro Pala
University of Florence