Asynchronous Multi-Object Tracking with an Event Camera

📅 2025-05-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the challenge of real-time, multi-object tracking of small-scale targets in high-speed dynamic scenes—such as swarming bees—using event-based vision. We propose an end-to-end event-stream-driven framework: (1) a Field of Active Flow Directions (FAFD) representation for event-level motion direction modeling; (2) an Asynchronous Event Blob (AEB) tracker coupled with a Surface of Active Events (SAE) dynamic surface modeling scheme; and (3) a deep learning–based intensity patch classifier for detection validation, integrated with optical flow consistency constraints to jointly estimate position, velocity, size, and orientation. Evaluated on our newly introduced Bee Swarm dataset, the method achieves over 37% improvement in both precision and recall compared to state-of-the-art event-based trackers, enabling stable, real-time tracking of dozens of rapidly moving bees.

📝 Abstract
Event cameras are ideal sensors for enabling robots to detect and track objects in highly dynamic environments due to their low-latency output, high temporal resolution, and high dynamic range. In this paper, we present the Asynchronous Event Multi-Object Tracking (AEMOT) algorithm for detecting and tracking multiple objects by processing individual raw events asynchronously. AEMOT detects salient event blob features by identifying regions of consistent optical flow using a novel Field of Active Flow Directions built from the Surface of Active Events. Detected features are tracked as candidate objects using the recently proposed Asynchronous Event Blob (AEB) tracker in order to construct small intensity patches of each candidate object. A novel learnt validation stage promotes or discards candidate objects based on classification of their intensity patches, with promoted objects having their position, velocity, size, and orientation estimated at their event rate. We evaluate AEMOT on a new Bee Swarm Dataset, where it tracks dozens of small bees with precision and recall performance exceeding that of alternative event-based detection and tracking algorithms by over 37%. Source code and the labelled event Bee Swarm Dataset will be open-sourced.
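To make the Surface of Active Events (SAE) and the flow-direction idea concrete, here is a minimal sketch, not the authors' implementation: the SAE stores the timestamp of the most recent event at each pixel, and the spatial gradient of those timestamps points along the apparent motion direction, which is the kind of per-pixel direction information a Field of Active Flow Directions aggregates. All names, sizes, and the finite-difference scheme below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): a Surface of Active
# Events holds the latest event timestamp per pixel.
H, W = 64, 64
sae = np.full((H, W), -np.inf)  # last-event timestamp at each pixel

def update_sae(x, y, t):
    """Record the newest event timestamp at pixel (x, y)."""
    sae[y, x] = t

def local_flow_direction(x, y):
    """Estimate the motion direction at (x, y) from the SAE gradient.

    Newer timestamps lie in the direction an edge has moved, so the
    gradient of the timestamp surface points along the flow direction.
    """
    gy = sae[min(y + 1, H - 1), x] - sae[max(y - 1, 0), x]
    gx = sae[y, min(x + 1, W - 1)] - sae[y, max(x - 1, 0)]
    if not np.isfinite(gx) or not np.isfinite(gy):
        return None  # too few neighbouring events to form a gradient
    return np.arctan2(gy, gx)  # angle of the local flow direction

# Drive a synthetic edge rightwards through the SAE: at time t the edge
# sits at column x = 10 + t, spanning rows 20..29.
for t, x in enumerate(range(10, 20)):
    for y in range(20, 30):
        update_sae(x, y, float(t))
```

For the synthetic rightward-moving edge above, `local_flow_direction(15, 25)` returns an angle near 0 radians, i.e. motion in the +x direction, which is the consistency signal a blob detector could threshold on.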
Problem

Research questions and friction points this paper is trying to address.

Tracking multiple objects asynchronously with event cameras
Detecting salient features using optical flow regions
Validating candidate objects via learned classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Asynchronous processing of raw event data
Field of Active Flow Directions for feature detection
Learnt validation stage for object classification
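The detect-track-validate flow described above can be sketched as a simple candidate life cycle. This is a hypothetical illustration: the `Candidate` structure, the score averaging, and the thresholds are assumptions for exposition, not values from the paper; in AEMOT the scores would come from the learnt intensity-patch classifier.

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    """A tracked candidate object awaiting promotion or discard."""
    position: tuple
    score_history: list = field(default_factory=list)
    promoted: bool = False

def validate(candidate, classifier_score, promote_at=0.9, discard_below=0.2):
    """Promote or discard a candidate from patch-classifier scores.

    Illustrative rule: average the classifier scores seen so far and
    compare against two (assumed) thresholds.
    """
    candidate.score_history.append(classifier_score)
    mean = sum(candidate.score_history) / len(candidate.score_history)
    if mean >= promote_at:
        candidate.promoted = True
        return "promoted"
    if mean < discard_below:
        return "discarded"
    return "pending"
```

A candidate whose patches consistently classify as a target is promoted and then has its state (position, velocity, size, orientation) estimated per event; one that consistently fails is discarded, keeping the tracker's attention on real objects.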
Angus Apps
Systems Theory and Robotics Group, Australian National University
Ziwei Wang
Systems Theory and Robotics Group, Australian National University
Vladimir Perejogin
Defence Science and Technology Group
Timothy Molloy
Systems Theory and Robotics Group, Australian National University
Robert Mahony
Australian National University
Control systems · Robotics · Aerial robotics · Optimization