🤖 AI Summary
Existing event- and frame-event trackers are evaluated almost exclusively on short-term datasets, which fails to reflect real-world long-term tracking requirements. To address this, the authors introduce FELT, a large-scale, long-term frame-event single-object tracking benchmark comprising 742 videos and roughly 1.59 million RGB-frame/event-stream pairs, the largest frame-event tracking dataset to date. Observing that both RGB frames and spatially sparse event streams are naturally incomplete under challenging conditions, they propose an associative memory Transformer that integrates modern Hopfield layers into multi-head self-attention blocks, serving as a unified backbone for fusing RGB and event data. The model is validated against 15 re-trained baseline trackers on FELT and further on RGB-Thermal (RGBT234, LasHeR) and RGB-Depth (DepthTrack) benchmarks. Both the source code and the FELT dataset are publicly released.
📝 Abstract
Current event- and frame-event based trackers are evaluated on short-term tracking datasets; however, real-world scenarios involve long-term tracking, and the performance of existing tracking algorithms in such settings remains unclear. In this paper, we first propose a new long-term, large-scale frame-event single-object tracking dataset, termed FELT. It contains 742 videos and 1,594,474 RGB-frame/event-stream pairs, making it the largest frame-event tracking dataset to date. We re-train and evaluate 15 baseline trackers on our dataset to provide reference points for future work. More importantly, we find that the RGB frames and event streams are naturally incomplete due to challenging imaging conditions and the spatial sparsity of event flow. In response, we propose a novel associative memory Transformer network as a unified backbone, introducing modern Hopfield layers into multi-head self-attention blocks to fuse RGB and event data. Extensive experiments on RGB-Event (FELT), RGB-Thermal (RGBT234, LasHeR), and RGB-Depth (DepthTrack) datasets fully validate the effectiveness of our model. The dataset and source code are available at https://github.com/Event-AHU/FELT_SOT_Benchmark.
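The core mechanism the abstract references, a modern (continuous) Hopfield layer, performs content-based retrieval: a query pattern is iteratively attracted toward the stored memory pattern most similar to it, via a softmax-weighted update that has the same form as attention. The sketch below is a minimal, hedged illustration of that retrieval rule, not the paper's actual architecture; the function name `hopfield_retrieve` and the inverse-temperature parameter `beta` are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hopfield_retrieve(queries, memories, beta=1.0, steps=1):
    """One or more modern-Hopfield update steps.

    queries:  (n_queries, dim)  state/query patterns (e.g. tokens of one modality)
    memories: (n_memories, dim) stored patterns (e.g. tokens of the other modality)
    beta:     inverse temperature; larger beta -> sharper, more selective retrieval
    Update rule (attention-like): Q <- softmax(beta * Q @ M^T) @ M
    """
    q = np.asarray(queries, dtype=float)
    m = np.asarray(memories, dtype=float)
    for _ in range(steps):
        q = softmax(beta * q @ m.T) @ m  # weighted recombination of stored patterns
    return q
```

With a high `beta` and a few update steps, a noisy or incomplete query converges toward the closest stored pattern, which is the intuition behind using such layers to complete sparse event tokens with information from RGB tokens (and vice versa); with a single step, the operation reduces to standard softmax attention over the memory set.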