🤖 AI Summary
This paper addresses the high latency and poor robustness of conventional frame-based optical marker systems in dynamic scenes and under challenging illumination. It presents the first comprehensive survey of the Event-Based Optical Marker System (EBOMS) paradigm, a technical framework that pairs event cameras with passive or active optical markers (e.g., AprilTags, blinking LEDs). By exploiting asynchronous signal processing and robust feature extraction, EBOMS achieve sub-millisecond detection latency, high-dynamic-range pose estimation, and strong illumination invariance. The survey establishes a comprehensive EBOMS technology map, clarifies interdisciplinary boundaries, and identifies core challenges including real-time performance, synchronization, and embedded deployment, offering a scalable framework and practical guidelines for robotic navigation, AR/VR interaction, and low-power edge vision applications.
📝 Abstract
The advent of event-based cameras, with their low latency, high dynamic range, and reduced power consumption, marked a significant change in robotic vision and machine perception. In particular, the combination of these neuromorphic sensors with widely available passive or active optical markers (e.g., AprilTags, arrays of blinking LEDs) has recently opened up a wide field of possibilities. This survey paper provides a comprehensive review of Event-Based Optical Marker Systems (EBOMS). We analyze the basic principles and technologies on which these systems are based, with a special focus on their asynchronous operation and robustness against adverse lighting conditions. We also describe the most relevant applications of EBOMS, including object detection and tracking, pose estimation, and optical communication. The article concludes with a discussion of possible future research directions in this rapidly emerging and multidisciplinary field.
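To make the marker-detection idea concrete: an event camera emits asynchronous `(x, y, t, polarity)` tuples, so an LED blinking at a known frequency shows up as ON events arriving at that rate at its pixel. The following is a minimal, hypothetical sketch of this frequency-based identification; the function name, the synthetic event stream, and the 100 Hz marker frequency are illustrative assumptions, not APIs or parameters from the surveyed systems.

```python
# Hypothetical sketch: spotting an active LED marker in an event stream by
# its blink frequency. Each event is an asynchronous (x, y, t, polarity)
# tuple; a rising brightness edge yields polarity +1.

from collections import defaultdict

def detect_blink_frequency(events):
    """Estimate a per-pixel blink frequency from ON-event timestamps.

    events: iterable of (x, y, t_seconds, polarity), polarity in {+1, -1}.
    Returns {(x, y): estimated_frequency_hz} for pixels with >= 2 ON events.
    """
    on_times = defaultdict(list)
    for x, y, t, p in events:
        if p > 0:  # keep only ON events (LED switching on)
            on_times[(x, y)].append(t)

    freqs = {}
    for pix, times in on_times.items():
        if len(times) < 2:
            continue
        # Mean interval between consecutive ON events -> blink frequency.
        intervals = [b - a for a, b in zip(times, times[1:])]
        freqs[pix] = 1.0 / (sum(intervals) / len(intervals))
    return freqs

# Synthetic stream: pixel (10, 10) hosts an LED blinking at 100 Hz
# (one ON event every 10 ms); pixel (3, 4) sees two sporadic noise events.
events = [(10, 10, k * 0.010, +1) for k in range(20)]
events += [(3, 4, 0.002, +1), (3, 4, 0.130, +1)]

freqs = detect_blink_frequency(events)
print(round(freqs[(10, 10)]))  # → 100
```

A real EBOMS pipeline would additionally cluster neighboring pixels into marker candidates and reject frequencies outside the expected band, but the core appeal is visible even here: identification uses only event timing, independent of scene illumination.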