🤖 AI Summary
To address the high cost and hardware complexity of multi-sensor fusion in autonomous driving, this paper proposes a passive monocular distance estimation method using a single event camera. It employs roadside LED light strips as active auxiliary illumination and applies the Phase-Only Correlation (POC) algorithm to the asynchronously generated event stream, enabling sub-pixel spot displacement detection and real-time monocular triangulation without stereo vision or additional sensors such as LiDAR or radar. This work is the first application of POC to depth estimation with a single event camera, introducing a novel spatiotemporal phase representation model for event streams. Real-vehicle experiments demonstrate a localization success rate above 90% and a mean absolute error below 0.5 m over a 20–60 m range, validating the feasibility of high-accuracy, low-cost ranging with event cameras under complex outdoor lighting conditions.
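The sub-pixel displacement detection at the heart of the method can be illustrated with the generic Phase-Only Correlation technique on two frames. This is a minimal sketch of textbook POC, not the paper's event-stream phase representation model; for brevity it recovers only integer shifts (true sub-pixel accuracy requires fitting a peak model, e.g. a sinc or Gaussian, around the maximum), and all function names are illustrative assumptions.

```python
import numpy as np

def estimate_shift(ref, moved):
    """Estimate the circular displacement of `moved` relative to `ref` via
    Phase-Only Correlation: whiten the cross-power spectrum so only phase
    (i.e. displacement) information remains, then locate the correlation peak."""
    R = np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))
    R /= np.abs(R) + 1e-12          # discard magnitude, keep phase only
    poc = np.real(np.fft.ifft2(R))  # POC surface: a sharp peak at the shift
    peak = np.unravel_index(np.argmax(poc), poc.shape)
    # unwrap circular peak coordinates to signed shifts in (-N/2, N/2]
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, poc.shape))

# Illustration: a synthetic frame shifted by (3, -5) pixels
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
shifted = np.roll(frame, (3, -5), axis=(0, 1))
print(estimate_shift(frame, shifted))  # (3, -5)
```

Because the phase-whitened spectrum discards intensity information, POC is largely insensitive to brightness changes, which suits the sparse, contrast-driven output of an event camera.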
📝 Abstract
With the growing adoption of autonomous driving, the advancement of sensor technology is crucial for ensuring safety and reliable operation. Sensor fusion techniques that combine multiple sensors such as LiDAR, radar, and cameras have proven effective, but the integration of multiple devices increases both hardware complexity and cost. Therefore, developing a single sensor capable of performing multiple roles is highly desirable for cost-efficient and scalable autonomous driving systems. Event cameras have emerged as a promising solution due to their unique characteristics, including high dynamic range, low latency, and high temporal resolution. These features enable them to perform well in challenging lighting conditions, such as low-light or backlit environments. Moreover, their ability to detect fine-grained motion events makes them suitable for applications like pedestrian detection and vehicle-to-infrastructure communication via visible light. In this study, we present a method for distance estimation using a monocular event camera and a roadside LED bar. By applying a phase-only correlation technique to the event data, we achieve sub-pixel precision in detecting the spatial shift between two light sources. This enables accurate triangulation-based distance estimation without requiring stereo vision. Field experiments conducted in outdoor driving scenarios demonstrated that the proposed approach achieves a success rate of over 90% with an error of less than 0.5 m for distances ranging from 20 to 60 meters. Future work includes extending this method to full position estimation by leveraging infrastructure such as smart poles equipped with LEDs, enabling event-camera-based vehicles to determine their own position in real time. This advancement could significantly enhance navigation accuracy, route optimization, and integration into intelligent transportation systems.
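The triangulation step described above reduces, in the simplest case, to the pinhole-camera relation between the known physical spacing of the two light sources and their measured pixel separation. The abstract does not give the exact geometry used in the paper, so the following is a minimal sketch under assumed parameter names, valid when the LED bar is roughly fronto-parallel to the image plane:

```python
def distance_from_leds(focal_px: float, led_spacing_m: float, pixel_sep: float) -> float:
    """Pinhole model: Z = f * W / d, where f is the focal length in pixels,
    W the physical spacing between the two LEDs (meters), and d the
    (sub-pixel) spot separation in the image, e.g. as measured by POC."""
    return focal_px * led_spacing_m / pixel_sep

# Illustration with hypothetical values: a 1200 px focal length,
# LEDs 1.0 m apart, observed 30 px apart in the image
print(distance_from_leds(1200.0, 1.0, 30.0))  # 40.0 (meters)
```

Since Z is inversely proportional to the pixel separation, the sub-pixel precision that POC provides matters most at long range, where a fraction of a pixel corresponds to several meters of depth.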