🤖 AI Summary
Event-driven eye tracking has long been held back by the absence of high-precision, high-rate (200 Hz) ground-truth datasets with eye-level annotations, which has impeded both supervised learning and algorithm evaluation. To address this, we introduce the first near-eye eye-movement benchmark dataset designed specifically for event cameras. We propose a semi-automatic pupil annotation pipeline tailored to event streams, enabling efficient, robust sub-pixel localization of the pupil center with millisecond-level temporal alignment. Crucially, we release the first publicly available 200 Hz synchronized pairs of event streams and ground-truth pupil positions. The dataset substantially alleviates the data bottleneck in event-based supervised training and quantitative evaluation, providing a reliable foundation for deep-learning model development, temporal modeling, and cross-modal alignment research, and it advances event-driven eye tracking toward higher accuracy and real-time performance.
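To make the "200 Hz synchronized pairs" concrete, here is a minimal sketch of how such a pairing could be consumed for supervised training: for each ground-truth label timestamp, slice the events that fall within one label period around it. The file names, array layouts, and the 5 ms window are assumptions for illustration, not the released format.

```python
import numpy as np

# Hypothetical file names and layouts (not the released format):
# events: rows of (t_us, x, y, polarity); labels: rows of (t_us, px, py)
# sampled at 200 Hz, i.e. one ground-truth pupil center every 5000 us.
events = np.load("events.npy")       # shape (N, 4), sorted by timestamp
labels = np.load("pupil_200hz.npy")  # shape (M, 3)

def events_for_label(events, t_label_us, window_us=5000):
    """Slice the events within +/- half a label period around one label."""
    t = events[:, 0]
    lo = np.searchsorted(t, t_label_us - window_us / 2)
    hi = np.searchsorted(t, t_label_us + window_us / 2)
    return events[lo:hi]

# One (event slice, pupil center) training pair per 200 Hz label.
pairs = [(events_for_label(events, t), (px, py)) for t, px, py in labels]
```

Because each label carries a microsecond timestamp, the same slicing works at other window sizes, which is what makes millisecond-level temporal alignment between events and annotations possible.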
📝 Abstract
Event-based eye tracking is a promising solution for efficient, low-power eye tracking in smart eyewear technologies. However, the novelty of event-based sensors means that few datasets are available, particularly ones with eye-level annotations, which are crucial for algorithm validation and deep-learning training. This paper addresses this gap by presenting an improved version of a popular event-based eye-tracking dataset. We introduce a semi-automatic annotation pipeline specifically designed for event-based data, and we provide the scientific community with the resulting pupil-detection annotations at 200 Hz.
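The annotation pipeline itself is detailed in the paper rather than here; purely as an illustration of how sub-pixel pupil localization can work on event data, the sketch below accumulates one event slice into a binary mask and fits an ellipse, whose center coordinates are floating-point and hence sub-pixel by construction. The sensor resolution, the helper name, and the largest-contour heuristic are assumptions, not the authors' method.

```python
import numpy as np
import cv2

def pupil_center_subpixel(event_slice, height=260, width=346):
    """Illustrative sub-pixel pupil-center estimate from one event slice.

    Assumptions: a 346x260 sensor, event coordinates inside the sensor
    plane, and that the largest event blob corresponds to the pupil edge.
    """
    mask = np.zeros((height, width), dtype=np.uint8)
    xs = event_slice[:, 1].astype(int)
    ys = event_slice[:, 2].astype(int)
    mask[ys, xs] = 255  # mark every pixel that fired at least one event

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if len(largest) < 5:  # cv2.fitEllipse needs at least 5 points
        return None
    (cx, cy), _, _ = cv2.fitEllipse(largest)
    return cx, cy  # float coordinates, i.e., sub-pixel
```

A semi-automatic pipeline would typically wrap a step like this with human review, keeping automatic estimates where they are reliable and correcting them by hand elsewhere.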