AI Summary
This work addresses the challenge of real-time line segment detection and tracking in human-made environments using only a high-resolution event camera, without any frame-based camera assistance. Methodologically, we propose the first purely event-driven solution, featuring a lattice-allocated processing pipeline, a velocity-invariant event representation, a geometric-fitting-score-based line detection mechanism, and an endpoint dynamic perturbation tracking algorithm, all parallelized for real-time performance. Experiments on both proprietary and public benchmarks demonstrate millisecond-level end-to-end latency and significantly higher detection and tracking accuracy than state-of-the-art event-frame hybrid and pure-event approaches. To our knowledge, this is the first method to achieve robust, standalone geometric feature perception in highly dynamic scenes using events alone.
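The summary does not specify how the velocity-invariant event representation is built. As illustration only, one common way to make a local event representation insensitive to edge speed is to rank-normalize recent timestamps in a window, so the pattern depends on the order of events rather than on how fast the edge moved; the function name and parameters below are hypothetical, not the paper's API:

```python
import numpy as np

def speed_invariant_patch(timestamps, x, y, r=3):
    """Sketch of a velocity-invariant local representation (assumption,
    not the paper's exact method): rank-normalize the per-pixel latest
    timestamps in a (2r+1)x(2r+1) window around (x, y).

    Because only the ordering of timestamps is used, rescaling time
    (i.e., a faster or slower moving edge) leaves the output unchanged.
    """
    patch = timestamps[y - r:y + r + 1, x - r:x + r + 1]
    # argsort twice converts values to their ranks within the patch
    ranks = patch.ravel().argsort().argsort().reshape(patch.shape)
    return ranks / (patch.size - 1)  # normalized to [0, 1]
```

For example, calling this on a timestamp map and on the same map with all times multiplied by 10 yields identical outputs, which is the invariance property being sketched.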
Abstract
Line segment extraction is effective for capturing the geometric features of human-made environments. Event-based cameras, which asynchronously respond to contrast changes along edges, enable efficient extraction by reducing redundant data. However, recent methods often rely on an additional frame camera or struggle with high event rates. This research addresses real-time line segment detection and tracking using only a modern, high-resolution (i.e., high event rate) event-based camera. Our lattice-allocated pipeline consists of (i) a velocity-invariant event representation, (ii) line segment detection based on a fitting score, and (iii) line segment tracking by perturbing endpoints. Evaluation on an ad-hoc recorded dataset and public datasets demonstrates real-time performance and higher accuracy than state-of-the-art event-only and event-frame hybrid baselines, enabling fully standalone event camera operation in real-world settings.
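To make steps (ii) and (iii) concrete, here is a minimal sketch under assumed details: a soft inlier count stands in for the paper's geometric fitting score, and tracking greedily perturbs one endpoint at a time while the score improves. All names, parameters, and the specific score are illustrative, not the paper's algorithm:

```python
import numpy as np

def fitting_score(events, p0, p1, eps=2.0):
    """Soft fitting score (a stand-in for the paper's geometric score):
    each event contributes support that decays linearly with its distance
    to the segment p0-p1, reaching zero beyond eps pixels."""
    d = p1 - p0
    length_sq = float(d @ d)
    if length_sq < 1e-9:
        return 0.0
    t = np.clip((events - p0) @ d / length_sq, 0.0, 1.0)  # projection parameter
    proj = p0 + t[:, None] * d                            # closest points on segment
    dist = np.linalg.norm(events - proj, axis=1)
    return float(np.maximum(0.0, 1.0 - dist / eps).sum())

def track_segment(events, p0, p1, step=1.0, iters=50, eps=2.0):
    """Endpoint perturbation tracking (sketch): repeatedly try moving one
    endpoint by +/-step along x or y, keeping the best move as long as
    the fitting score strictly improves."""
    offsets = step * np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)
    best = fitting_score(events, p0, p1, eps)
    for _ in range(iters):
        candidate = None
        for endpoint in (0, 1):
            for off in offsets:
                q0 = p0 + off if endpoint == 0 else p0
                q1 = p1 + off if endpoint == 1 else p1
                s = fitting_score(events, q0, q1, eps)
                if s > best:
                    best, candidate = s, (q0, q1)
        if candidate is None:
            break  # local optimum reached
        p0, p1 = candidate
    return p0, p1, best
```

In a real tracker, `events` would be the most recent spatially binned slice of the event stream, and `step` and `eps` would be tuned to sensor resolution and event rate.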