🤖 AI Summary
To address challenges in LiDAR–camera extrinsic calibration—including difficult cross-modal registration, point cloud edge blurring that degrades accuracy, and limited generality across solid-state and mechanical LiDARs—this paper proposes a fully automatic calibration method leveraging a custom-designed 3D calibration target. The approach introduces three key innovations: (1) a scan-pattern-agnostic edge extraction algorithm for robust 3D edge detection; (2) ellipse fitting to compensate for edge expansion caused by laser spot diffusion; and (3) a multi-scene joint nonlinear optimization framework that enhances robustness and generalizability. Evaluated on three distinct LiDAR types, the method achieves a point-to-point registration error ≤6.5 mm and completes calibration in ≤0.7 s per instance, outperforming state-of-the-art methods in both accuracy and stability. The source code and dataset are publicly available.
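To illustrate innovation (2), the sketch below shows how an algebraic least-squares ellipse fit can recover a circular target hole's center even when every edge point is pushed outward by laser spot spread: the dilation inflates the fitted conic but leaves its center nearly unchanged. This is a minimal illustrative example, not the paper's implementation; the synthetic hole radius, spread offset, and noise levels are assumptions.

```python
import numpy as np

def fit_ellipse_center(pts):
    """Algebraic least-squares ellipse fit.

    Solves a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1 for the conic
    coefficients, then recovers the ellipse center as the point
    where the conic gradient vanishes.
    """
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y])
    coef, *_ = np.linalg.lstsq(A, np.ones(len(pts)), rcond=None)
    a, b, c, d, e = coef
    # Gradient [2a*x + b*y + d, b*x + 2c*y + e] = 0 at the center.
    return np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])

# Synthetic dilated hole edge: true radius 0.10 m, observed points
# pushed outward by a constant spot-spread offset (hypothetical values).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)
r_obs = 0.10 + 0.004 + rng.normal(0, 0.0005, theta.size)
pts = np.column_stack([0.3 + r_obs * np.cos(theta),
                       -0.1 + r_obs * np.sin(theta)])

cx, cy = fit_ellipse_center(pts)
print(cx, cy)  # close to the true center (0.3, -0.1) despite dilation
```

The key property exploited here is that uniform edge expansion changes the fitted ellipse's axes but not its center, so circle centers remain reliable registration features.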
📝 Abstract
This paper proposes FAST-Calib, a fast and user-friendly LiDAR-camera extrinsic calibration tool based on a custom-made 3D target. FAST-Calib supports both mechanical and solid-state LiDARs by leveraging an efficient and reliable edge extraction algorithm that is agnostic to LiDAR scan patterns. It also compensates for edge dilation artifacts caused by LiDAR spot spread through ellipse fitting, and supports joint optimization across multiple scenes. We validate FAST-Calib on three LiDAR models (Ouster, Avia, and Mid360), each paired with a wide-angle camera. Experimental results demonstrate superior accuracy and robustness compared to existing methods. With point-to-point registration errors consistently below 6.5 mm and total processing time under 0.7 s, FAST-Calib provides an efficient, accurate, and target-based automatic calibration pipeline. We have open-sourced our code and dataset on GitHub to benefit the robotics community.
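The multi-scene joint optimization idea can be sketched with a simplified stand-in: if target correspondences from several scenes are stacked and solved together, one extrinsic estimate explains all scenes at once instead of averaging per-scene fits. The sketch below uses the closed-form Kabsch alignment rather than the paper's nonlinear optimizer, and all scene data, noise levels, and the ground-truth transform are assumptions for illustration.

```python
import numpy as np

def kabsch(P, Q):
    """Closed-form rigid alignment (Kabsch) of point set P onto Q."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps the result a proper rotation.
    S = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    return R, cq - R @ cp

# Three scenes sharing one unknown extrinsic (hypothetical values):
# stacking their correspondences yields a single joint estimate.
rng = np.random.default_rng(1)
ang = np.deg2rad(10)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                   [np.sin(ang),  np.cos(ang), 0],
                   [0,            0,           1]])
t_true = np.array([0.05, -0.02, 0.10])
P = np.vstack([rng.normal(size=(40, 3)) for _ in range(3)])
Q = P @ R_true.T + t_true + rng.normal(0, 0.001, P.shape)

R, t = kabsch(P, Q)
print(np.round(t, 3))  # recovered translation, near t_true
```

Pooling scenes this way constrains the estimate from multiple target placements, which is the intuition behind the joint formulation's improved robustness.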