🤖 AI Summary
Event cameras suffer from low geometric calibration accuracy in long-range measurement scenarios. To address this, we propose a novel calibration method integrating an optical collimator with a scintillating star field—marking the first use of a collimator for event camera calibration. Our approach establishes a joint framework combining spherical motion modeling for initial parameter estimation and Levenberg–Marquardt nonlinear optimization, while ensuring precise spatiotemporal alignment of event streams. Evaluated in long-distance settings, the method achieves sub-pixel calibration accuracy: intrinsic parameter errors are reduced by 42%, and extrinsic rotational deviations remain below 0.008°. These results significantly outperform state-of-the-art methods, demonstrating industrial-grade robustness and reproducibility.
📝 Abstract
Event cameras are a new type of brain-inspired visual sensor with advantages such as high dynamic range and high temporal resolution. The geometric calibration of event cameras, which involves determining their intrinsic and extrinsic parameters, particularly in long-range measurement scenarios, remains a significant challenge. To address the dual requirements of long-distance and high-precision measurement, we propose an event camera calibration method utilizing a collimator with flickering star-based patterns. The proposed method first linearly solves camera parameters using the sphere motion model of the collimator, followed by nonlinear optimization to refine these parameters with high precision. Through comprehensive real-world experiments across varying conditions, we demonstrate that the proposed method consistently outperforms existing event camera calibration methods in terms of accuracy and reliability.