🤖 AI Summary
To address the challenge of real-time reconstruction of high-quality intensity images from event cameras mounted on UAVs under low-illumination and high-dynamic-range conditions, this paper proposes an Event-based Single Integration (ESI) framework coupled with an enhanced decay fusion scheme. The method achieves real-time intensity image reconstruction at 100 FPS or more while preserving the event camera’s inherent advantages—namely, asynchronous operation and ultra-low power consumption—thereby significantly reducing computational overhead. By optimizing spatiotemporal event integration strategies and refining photometric decay modeling, the approach improves signal-to-noise ratio and structural fidelity in extremely low-light scenarios (2–10 lux). Experimental results demonstrate that the method outperforms state-of-the-art approaches in reconstruction quality, frame rate, and runtime efficiency. It enables seamless migration of conventional frame-based vision algorithms to the event domain and supports robust visual tracking under adverse illumination conditions.
📝 Abstract
Event cameras offer significant advantages, including a wide dynamic range, high temporal resolution, and immunity to motion blur, making them highly promising for addressing challenging visual conditions. Extracting and utilizing effective information from asynchronous event streams is essential for the onboard deployment of event cameras. In this paper, we propose a streamlined event-based intensity reconstruction scheme, event-based single integration (ESI), to address these implementation challenges. The method preserves the portability of conventional frame-based vision methods to event-based scenarios while maintaining the intrinsic advantages of event cameras. ESI reconstructs intensity images by performing a single integration of the event stream combined with an enhanced decay algorithm, enabling real-time intensity reconstruction at a high frame rate (typically 100 FPS). Furthermore, the relatively low computational load of ESI makes it well suited to onboard implementation, such as in UAV-based visual tracking scenarios. Extensive experiments compare ESI against state-of-the-art algorithms: ESI demonstrates remarkable runtime efficiency, superior reconstruction quality, and a high frame rate, significantly enhancing UAV onboard perception in visually adverse surroundings. In in-flight tests, ESI delivers effective UAV onboard visual tracking under extremely low illumination (2–10 lux), whereas comparative algorithms fail due to insufficient frame rate, poor image quality, or limited real-time performance.
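The abstract does not spell out the details of the enhanced decay algorithm, but the general idea of direct event integration with decay can be illustrated with a minimal sketch. The snippet below is a hypothetical, simplified model (not the paper's actual implementation): each event adds a signed contrast step `c` to a per-pixel log-intensity state, which decays exponentially with time constant `tau` toward a neutral level, and frames are emitted at a fixed rate; all parameter values are illustrative assumptions.

```python
import numpy as np

def reconstruct_frames(events, shape, fps=100.0, c=0.1, tau=0.03):
    """Sketch of direct event integration with exponential decay.

    events: time-ordered iterable of (t, x, y, p), t in seconds, p in {-1, +1}.
    shape:  (H, W) sensor resolution.
    c:      assumed per-event log-intensity contrast step (hypothetical value).
    tau:    assumed decay time constant pulling pixels back to neutral.
    """
    log_img = np.zeros(shape, dtype=np.float64)  # per-pixel log-intensity state
    last_t = np.zeros(shape, dtype=np.float64)   # per-pixel time of last event
    frames, next_frame_t = [], 1.0 / fps
    for t, x, y, p in events:
        while t >= next_frame_t:
            # Emit a frame: decay every pixel to the frame timestamp, then map
            # log-intensity to a displayable 8-bit image around mid-gray.
            decayed = log_img * np.exp(-(next_frame_t - last_t) / tau)
            frames.append(np.clip(128 + 255 * decayed, 0, 255).astype(np.uint8))
            next_frame_t += 1.0 / fps
        # Decay this pixel's accumulated value since its last event, then apply
        # the single-integration update: one signed contrast step per event.
        log_img[y, x] *= np.exp(-(t - last_t[y, x]) / tau)
        log_img[y, x] += p * c
        last_t[y, x] = t
    return frames
```

Because each event touches only one pixel and frames are produced by a single vectorized decay-and-clip pass, per-event cost stays constant, which is consistent with the low computational load the abstract attributes to single integration.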