Event6D: Event-based Novel Object 6D Pose Tracking

📅 2026-03-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of accurate 6D object pose estimation in high-speed dynamic scenes, where conventional RGB and depth-based methods suffer from motion blur and large displacements. The authors propose EventTrack6D, the first event-driven 6D pose tracking framework trained exclusively on synthetic data, which generalizes to real-world scenarios without object-specific fine-tuning. Leveraging the microsecond temporal resolution of event cameras, the method reconstructs scene observations at arbitrary timestamps through a dual reconstruction mechanism—recovering both intensity and depth—and employs depth-conditioned sparse-to-dense event densification. This enables high-frame-rate, temporally consistent tracking of unseen objects. The system operates at over 120 FPS and achieves high accuracy on both a newly introduced real-world dataset and a large-scale simulated benchmark. The code and datasets are publicly released.
📝 Abstract
Event cameras provide microsecond latency, making them suitable for 6D object pose tracking in fast, dynamic scenes where conventional RGB and depth pipelines suffer from motion blur and large pixel displacements. We introduce EventTrack6D, an event-depth tracking framework that generalizes to novel objects without object-specific training by reconstructing both intensity and depth at arbitrary timestamps between depth frames. Conditioned on the most recent depth measurement, our dual reconstruction recovers dense photometric and geometric cues from sparse event streams. EventTrack6D operates at over 120 FPS and maintains temporal consistency under rapid motion. To support training and evaluation, we introduce a comprehensive benchmark suite: a large-scale synthetic dataset for training and two complementary evaluation sets, including real and simulated event datasets. Trained exclusively on synthetic data, EventTrack6D generalizes effectively to real-world scenarios without fine-tuning, maintaining accurate tracking across diverse objects and motion patterns. Our method and datasets validate the effectiveness of event cameras for 6D pose tracking of novel objects. Code and datasets are publicly available at https://chohoonhee.github.io/Event6D.
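The core idea of reading out an event stream "at arbitrary timestamps" can be illustrated with a standard event-accumulation step: sum polarity-weighted events inside a short window around any query time to form a sparse frame. This is a minimal sketch of that generic representation, not the paper's dual intensity-depth reconstruction; the `(t, x, y, polarity)` layout, sensor resolution, and 5 ms window are illustrative assumptions.

```python
import numpy as np

def events_to_frame(events, t_query, window=5e-3, shape=(180, 240)):
    """Accumulate polarity-weighted events around an arbitrary query
    timestamp t_query (seconds) into a sparse 2D frame.

    `events` is an (N, 4) array of (t, x, y, polarity in {-1, +1}).
    This is a generic event representation, not EventTrack6D's
    reconstruction module.
    """
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]
    # Keep only events within +/- window/2 of the query timestamp.
    mask = np.abs(t - t_query) <= window / 2
    frame = np.zeros(shape, dtype=np.float32)
    # np.add.at handles repeated pixel indices correctly (unbuffered add).
    np.add.at(frame, (y[mask], x[mask]), p[mask])
    return frame

# Two positive events at pixel (x=20, y=10) near t=0.1 s; one event far away in time.
events = np.array([
    [0.1000, 20, 10, +1],
    [0.1010, 20, 10, +1],
    [0.2000,  5,  5, -1],
])
frame = events_to_frame(events, t_query=0.1)
```

Because the frame can be rebuilt at any `t_query`, such representations give event pipelines their high effective frame rate; the paper's contribution is recovering dense intensity and depth from this kind of sparse input, conditioned on the latest depth frame.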
Problem

Research questions and friction points this paper is trying to address.

- 6D pose tracking
- event cameras
- novel objects
- motion blur
- dynamic scenes
Innovation

Methods, ideas, or system contributions that make the work stand out.

- event camera
- 6D pose tracking
- novel object generalization
- intensity-depth reconstruction
- synthetic-to-real transfer