PLK-Calib: Single-shot and Target-less LiDAR-Camera Extrinsic Calibration using Plücker Lines

📅 2025-03-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the limitations of conventional LiDAR–camera extrinsic calibration—namely, reliance on artificial calibration targets and multiple data acquisitions—this paper proposes a single-shot, targetless, line-feature-driven calibration method. Leveraging geometric constraints of common perpendicularity and parallelism among 3D lines represented in Plücker coordinates, we achieve, for the first time, theoretical decoupling of rotational and translational parameter estimation. We rigorously prove that only three non-parallel corresponding 3D–2D line pairs are sufficient for unique solution recovery, overcoming the traditional dependence on point- or plane-based features. The method formulates and minimizes line reprojection error, incorporates degeneracy analysis, and validates robustness via Monte Carlo simulations. Evaluated on a custom multi-scene dataset, our approach achieves mean rotational and translational errors of 0.12° and 1.8 cm, respectively—substantially outperforming existing targetless methods.
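The decoupling claim rests on how Plücker coordinates behave under a rigid transform: the line's unit direction is moved by the rotation alone, while its moment mixes rotation and translation. A minimal sketch of this property (illustrative code, not the paper's implementation; function names are assumptions):

```python
import numpy as np

def plucker_from_points(p, q):
    """Plücker coordinates (d, m) of the 3D line through points p and q:
    d is the unit direction, m = p x d is the moment about the origin."""
    d = (q - p) / np.linalg.norm(q - p)
    m = np.cross(p, d)
    return d, m

def transform_plucker(R, t, d, m):
    """Rigid transform (R, t) of a Plücker line.
    The direction sees only R; the moment mixes R and t,
    which is why rotation and translation can be estimated separately."""
    d2 = R @ d
    m2 = R @ m + np.cross(t, d2)
    return d2, m2
```

Because angles between directions are preserved by `R`, perpendicularity and parallelism constraints between line pairs constrain the rotation independently of `t`.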

📝 Abstract
Accurate LiDAR-Camera (LC) calibration is challenging but crucial for autonomous systems and robotics. In this paper, we propose two single-shot and target-less algorithms to estimate the calibration parameters between LiDAR and camera using line features. The first algorithm constructs line-to-line constraints by defining point-to-line projection errors and minimizes the projection error. The second algorithm (PLK-Calib) utilizes the co-perpendicular and co-parallel geometric properties of lines in Plücker (PLK) coordinates, and decouples the rotation and translation into two constraints, enabling more accurate estimates. Our degeneracy analysis and Monte Carlo simulation indicate that three non-parallel line pairs are the minimal requirement to estimate the extrinsic parameters. Furthermore, we collect an LC calibration dataset with varying extrinsics under three different scenarios and use it to evaluate the performance of our proposed algorithms.
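The first algorithm's point-to-line projection error can be sketched as follows: project the endpoints of a LiDAR 3D line into the image and measure their perpendicular distance to the detected 2D line. This is a simplified pinhole-model illustration under assumed conventions, not the paper's exact formulation:

```python
import numpy as np

def project_point(K, R, t, P):
    """Pinhole projection of a 3D point P into pixel coordinates."""
    x = K @ (R @ P + t)
    return x[:2] / x[2]

def point_to_line_error(line_2d, uv):
    """Perpendicular distance of pixel uv to the image line
    a*u + b*v + c = 0, with line_2d = (a, b, c)."""
    a, b, c = line_2d
    return abs(a * uv[0] + b * uv[1] + c) / np.hypot(a, b)

def line_reprojection_error(K, R, t, P0, P1, line_2d):
    """Sum of endpoint-to-line distances for one 3D-2D line pair;
    calibration minimizes this over all pairs."""
    return (point_to_line_error(line_2d, project_point(K, R, t, P0))
            + point_to_line_error(line_2d, project_point(K, R, t, P1)))
```

Summing this residual over all matched line pairs and minimizing over (R, t) yields the first algorithm's estimate.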
Problem

Research questions and friction points this paper is trying to address.

How to achieve accurate LiDAR-Camera extrinsic calibration for autonomous systems.
How to calibrate in a single shot, without calibration targets, using line features.
What the minimal feature requirement is (three non-parallel line pairs).
Innovation

Methods, ideas, or system contributions that make the work stand out.

Single-shot, target-less LiDAR-camera calibration
Uses Plücker lines for geometric constraints
Decouples rotation and translation for accuracy
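The three-non-parallel-line minimum can be sanity-checked for the rotation part: once line directions are in correspondence, recovering the rotation is the classic direction-alignment problem, solvable in closed form via SVD (the Kabsch method). This is an illustrative stand-in for the rotation step, not the paper's solver:

```python
import numpy as np

def rotation_from_directions(D_lidar, D_cam):
    """Least-squares rotation R with D_cam[i] ~ R @ D_lidar[i],
    via the Kabsch/SVD method. Rows of each array are unit directions;
    three non-parallel directions already determine R uniquely."""
    H = D_lidar.T @ D_cam                 # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    s = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, s]) @ U.T  # proper rotation (det = +1)
```

With fewer than three non-parallel directions the correlation matrix is rank-deficient and the rotation is underdetermined, matching the paper's degeneracy analysis.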
Yanyu Zhang
Ph.D., University of California, Riverside
SLAM · Visual Inertial Navigation · Computer Vision
Jie Xu
Department of Electrical and Computer Engineering, University of California, Riverside, CA, 92521, USA
Wei Ren
Department of Electrical and Computer Engineering, University of California, Riverside, CA, 92521, USA