🤖 AI Summary
To address the low-latency, motion-blur-resilient communication requirements of vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) applications in ADAS and autonomous driving, this paper proposes an in-vehicle visible light communication (VLC) system that leverages event cameras and multi-LED arrays. Methodologically, it introduces, for the first time, event cameras for dynamic VLC reception, exploiting their asynchronous, microsecond-level response to overcome the motion blur inherent in conventional frame-based cameras. A Walsh-Hadamard-coded pilot sequence is designed to enable robust LED source localization and dense signal separation under high-speed mobility. Furthermore, LED-array modulation is integrated with motion-adaptive optical signal tracking and demodulation. Real-vehicle experiments demonstrate a zero bit-error rate at a 40 m link distance and a 30 km/h relative velocity, validating the system's feasibility and reliability in realistic driving scenarios.
📝 Abstract
In the fields of Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD), sensors that serve as the "eyes" for perceiving the vehicle's surrounding environment are essential. Traditionally, image sensors and LiDAR have played this role. However, a new type of vision sensor, the event camera, has recently attracted attention. Event cameras respond to changes in the surrounding environment (e.g., motion), are highly robust against motion blur, and perform well in high-dynamic-range environments, properties that are desirable in robotics applications. Furthermore, their asynchronous, low-latency data-acquisition principle makes event cameras well suited to optical communication. By adding communication functionality to event cameras, infrastructure-to-vehicle (I2V) communication can be used to immediately share information about forward collisions, sudden braking, and road conditions, thereby contributing to hazard avoidance. Additionally, receiving information such as signal timing and traffic volume enables speed adjustment and optimal route selection, facilitating more efficient driving. In this study, we construct a vehicular visible light communication system in which event cameras serve as receivers and multiple LEDs serve as transmitters. In driving scenes, the system tracks the transmitter positions and separates densely packed LED light sources using pilot sequences based on Walsh-Hadamard codes. Outdoor driving experiments demonstrate error-free communication at transmitter-receiver distances of up to 40 meters and a driving speed of 30 km/h (8.3 m/s).
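The source-separation idea behind the Walsh-Hadamard pilots can be illustrated with a minimal sketch: rows of a Hadamard matrix are mutually orthogonal, so each LED can blink a distinct row as its pilot, and the receiver can recover each LED's contribution from the superimposed signal by correlation. The code length (8 chips), LED count (3), and amplitudes below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# 8-chip Walsh-Hadamard pilot codes; skip the all-ones row (DC) and
# assign the next three rows to three hypothetical LEDs.
H = hadamard(8)
pilots = H[1:4]

# Superimpose the three pilot sequences at the receiver, as would happen
# when densely packed LED sources overlap in the event stream.
amplitudes = np.array([2.0, 0.5, 1.0])   # per-LED signal strengths
received = amplitudes @ pilots           # mixed 8-chip observation

# Correlating against each code and normalizing by the code length
# recovers each LED's amplitude, thanks to row orthogonality.
recovered = received @ pilots.T / 8
print(recovered)  # → [2.  0.5 1. ]
```

Because the rows satisfy H_i · H_j = 0 for i ≠ j (and H_i · H_i = 8), the cross-terms cancel exactly; this is the property that lets the receiver both localize and separate individual LED sources from overlapping pilot transmissions.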