LECalib: Line-Based Event Camera Calibration

📅 2025-12-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing event camera calibration methods rely on artificial calibration targets (e.g., checkerboards or flashing patterns) or reconstructed intensity images, resulting in high manual effort, long calibration times, and poor adaptability to dynamic scenes. To address these limitations, this paper proposes a novel intrinsic calibration method based solely on geometric line features—requiring no calibration board, flashing pattern, or image reconstruction. The approach directly detects straight-line structures (e.g., door/window edges) from raw event streams in man-made environments, constructs an event-to-line geometric model, and performs robust initial estimation followed by end-to-end nonlinear least-squares optimization. It supports both planar and non-planar lines and is compatible with monocular and stereo event cameras. Extensive evaluation on synthetic and real-world datasets demonstrates significant improvements in calibration efficiency and scene adaptability, achieving state-of-the-art accuracy. The source code is publicly available.

📝 Abstract
Camera calibration is an essential prerequisite for event-based vision applications. Current event camera calibration methods typically involve using flashing patterns, reconstructing intensity images, and utilizing the features extracted from events. Existing methods are generally time-consuming and require manually placed calibration objects, which cannot meet the needs of rapidly changing scenarios. In this paper, we propose a line-based event camera calibration framework exploiting the geometric lines of commonly-encountered objects in man-made environments, e.g., doors, windows, boxes, etc. Different from previous methods, our method detects lines directly from event streams and leverages an event-line calibration model to generate the initial guess of camera parameters, which is suitable for both planar and non-planar lines. Then, a non-linear optimization is adopted to refine camera parameters. Both simulation and real-world experiments have demonstrated the feasibility and accuracy of our method, with validation performed on monocular and stereo event cameras. The source code is released at https://github.com/Zibin6/line_based_event_camera_calib.
Problem

Research questions and friction points this paper is trying to address.

Existing methods depend on manually placed calibration targets (checkerboards or flashing patterns)
Target-based pipelines are time-consuming and adapt poorly to rapidly changing scenes
General environmental lines may be non-planar, which prior planar-feature assumptions cannot handle
Innovation

Methods, ideas, or system contributions that make the work stand out.

Line-based calibration from event streams
Event-line model for initial parameter estimation
Non-linear optimization for parameter refinement
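The refinement step in the last bullet can be sketched on synthetic data. This is a minimal illustration, not the paper's implementation: it assumes the 3D line segments are already known in the camera frame (the actual method works from raw event streams, estimates additional unknowns, and handles lens distortion), and all function names here are invented. Each event pixel contributes a point-to-line distance to the image projection of its associated 3D line, and a plain Gauss-Newton loop refines the pinhole intrinsics (fx, fy, cx, cy):

```python
import numpy as np

def line_residuals(params, lines_3d, event_pixels):
    """Signed point-to-line distances of event pixels w.r.t. projected 3D lines."""
    fx, fy, cx, cy = params
    K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])
    res = []
    for (P1, P2), pts in zip(lines_3d, event_pixels):
        p1 = K @ P1
        p2 = K @ P2
        # Homogeneous image line through the two projected endpoints.
        l = np.cross(p1 / p1[2], p2 / p2[2])
        norm = np.hypot(l[0], l[1])
        for u, v in pts:
            res.append((l[0] * u + l[1] * v + l[2]) / norm)
    return np.array(res)

def refine_intrinsics(init, lines_3d, event_pixels, iters=25, eps=1e-6):
    """Gauss-Newton with a forward-difference Jacobian; an illustrative
    stand-in for the paper's nonlinear least-squares refinement."""
    x = np.asarray(init, dtype=float)
    for _ in range(iters):
        r = line_residuals(x, lines_3d, event_pixels)
        J = np.empty((r.size, x.size))
        for j in range(x.size):
            d = np.zeros_like(x)
            d[j] = eps
            J[:, j] = (line_residuals(x + d, lines_3d, event_pixels) - r) / eps
        x = x - np.linalg.solve(J.T @ J, J.T @ r)
    return x

# Synthetic check: "events" sampled from projections of known 3D line segments.
K_true = np.array([[400.0, 0.0, 320.0], [0.0, 400.0, 240.0], [0.0, 0.0, 1.0]])
lines_3d = [
    (np.array([-1.0, -0.5, 3.0]), np.array([1.0, 0.4, 4.0])),
    (np.array([-0.8, 0.6, 2.5]), np.array([0.9, -0.7, 3.5])),
    (np.array([0.2, -0.9, 4.0]), np.array([-0.5, 0.8, 2.8])),
]
event_pixels = []
for P1, P2 in lines_3d:
    pts = []
    for t in np.linspace(0.1, 0.9, 15):
        p = K_true @ (P1 + t * (P2 - P1))
        pts.append((p[0] / p[2], p[1] / p[2]))
    event_pixels.append(pts)

# Start from perturbed intrinsics and refine.
fx, fy, cx, cy = refine_intrinsics([380.0, 420.0, 300.0, 260.0],
                                   lines_3d, event_pixels)
```

The point-to-line residual is what makes the formulation target-free: events only need to be associated with a line, not with a specific point on it, so no flashing pattern or checkerboard corner correspondence is required.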
Zibin Liu
National University of Defense Technology
Neuromorphic vision sensors, Event camera, Camera calibration, Pose estimation, Object tracking
Banglei Guan
National University of Defense Technology
Photomechanics, Videometrics
Yang Shang
The College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China
Zhenbao Yu
The GNSS Research Center, Wuhan University, Wuhan 430000, China
Yifei Bian
The College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China
Qifeng Yu
The College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China