Event-Based Method for High-Speed 3D Deformation Measurement under Extreme Illumination Conditions

📅 2026-03-30
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This study addresses the challenge of accurately capturing high-speed three-dimensional deformations of large-scale engineering structures under extreme illumination conditions, where conventional cameras suffer from limited dynamic range. The work presents the first complete application of a multi-event camera array to this problem, proposing an end-to-end, high-precision measurement framework. Fiducial marker centers are extracted by combining the asynchronous event stream with temporal correlation analysis; solving the Kruppa equations enables rapid self-calibration; and a unified coordinate transformation combined with linear intersection yields the 3D reconstruction. The method achieves fully automatic multi-camera self-calibration and deformation measurement under harsh lighting, attaining a relative error below 0.08%, significantly outperforming existing approaches.
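The pipeline's first stage, extracting marker centers from asynchronous event streams via temporal correlation, can be made concrete. The paper's exact algorithm is not reproduced here; the sketch below only illustrates the general idea, assuming blinking LED fiducials at a known frequency and a hypothetical structured-array event format (every name and parameter is illustrative, not from the paper).

```python
import numpy as np

def marker_center(events, t0, t1, roi, f_blink, sensor_shape):
    """Locate one blinking fiducial marker in an event stream.

    Minimal sketch of the idea only, NOT the paper's algorithm:
    accumulate events in a time window, score each pixel by how well
    its event count matches the rate expected from a marker blinking
    at f_blink Hz, and return the score-weighted centroid.

    Assumed (hypothetical) inputs:
      events       -- NumPy structured array with int fields 'x', 'y'
                      and float field 't' (seconds)
      t0, t1       -- analysis window in seconds
      roi          -- (x_min, y_min, x_max, y_max) search region
      f_blink      -- known marker blink frequency in Hz
      sensor_shape -- (height, width) of the event camera
    """
    x0, y0, x1, y1 = roi
    keep = ((events["t"] >= t0) & (events["t"] < t1)
            & (events["x"] >= x0) & (events["x"] < x1)
            & (events["y"] >= y0) & (events["y"] < y1))
    ev = events[keep]

    # Per-pixel event counts over the window.
    counts = np.zeros(sensor_shape, dtype=np.float64)
    np.add.at(counts, (ev["y"], ev["x"]), 1.0)

    # A marker blinking at f_blink Hz triggers roughly two polarity
    # bursts per cycle; pixels whose count is near that expectation
    # are treated as marker pixels (a crude temporal-correlation proxy).
    expected = 2.0 * f_blink * (t1 - t0)
    score = np.exp(-((counts - expected) / (0.5 * expected + 1e-9)) ** 2)
    score[counts == 0] = 0.0

    total = score.sum()
    if total == 0.0:
        return None  # no marker-like activity in the window
    ys, xs = np.nonzero(score)
    w = score[ys, xs]
    return (float((xs * w).sum() / total), float((ys * w).sum() / total))
```

If the markers are actively blinking, their temporal signature separates them from background activity even under strong illumination, which is the property the sketch exploits.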
📝 Abstract
Background: Large engineering structures, such as space launch towers and suspension bridges, are subjected to extreme forces that cause high-speed 3D deformation and compromise safety. These structures typically operate under extreme illumination conditions, where traditional cameras often struggle with strong light intensity and overexpose because of their limited dynamic range.

Objective: Event cameras have emerged as a compelling alternative to traditional cameras in high-dynamic-range and low-latency applications. This paper presents an integrated method, from calibration to measurement, that uses a multi-event camera array for high-speed 3D deformation monitoring of structures under extreme illumination conditions.

Methods: First, the proposed method combines the characteristics of the asynchronous event stream with temporal correlation analysis to extract the center points of corresponding markers. Next, it achieves rapid calibration by solving the Kruppa equations within a parameter optimization framework. Finally, by employing a unified coordinate transformation and linear intersection, it measures the 3D deformation of the target structure.

Results: Experiments confirmed that the relative measurement error is below 0.08%. Field experiments under extreme illumination conditions, including self-calibration of a multi-event camera array and 3D deformation measurement, verified the performance of the proposed method.

Conclusions: This paper addresses the critical limitation of traditional cameras in measuring high-speed 3D deformations under extreme illumination conditions. The experimental results demonstrate that, compared with other methods, the proposed method can accurately measure 3D deformations of structures under harsh lighting, with a relative deformation error of less than 0.1%.
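The "linear intersection" named in the Methods section corresponds to standard multi-view triangulation: each calibrated camera contributes two linear constraints on the unknown 3D point, and the stacked homogeneous system is solved in the least-squares sense. A minimal direct-linear-transform sketch follows (not code from the paper; `P_list` and `uv_list` stand in for outputs of the self-calibration and marker-extraction stages).

```python
import numpy as np

def linear_intersection(P_list, uv_list):
    """Triangulate one 3D point from two or more calibrated views.

    Standard DLT-style linear intersection, not code from the paper.
      P_list  -- list of 3x4 projection matrices (one per event camera),
                 expressed in a unified world coordinate frame
      uv_list -- matching list of (u, v) marker centers in pixels

    Each view gives u*(P[2] @ X) = P[0] @ X and v*(P[2] @ X) = P[1] @ X;
    stacking these yields A @ X = 0, solved via SVD.
    """
    rows = []
    for P, (u, v) in zip(P_list, uv_list):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]               # null vector = homogeneous 3D point
    return X[:3] / X[3]      # dehomogenize
```

Deformation measurement then reduces to differencing: reconstruct the same marker at a reference time and at each later time, and report the 3D displacement between the two reconstructed points.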
Problem

Research questions and friction points this paper is trying to address.

3D deformation measurement
extreme illumination conditions
high-speed deformation
event cameras
structural monitoring
Innovation

Methods, ideas, or system contributions that make the work stand out.

event camera
3D deformation measurement
extreme illumination
high-speed monitoring
self-calibration
Banglei Guan
National University of Defense Technology
Photomechanics, Videometrics
Yifei Bian
College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China; Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha 410073, China
Zibin Liu
National University of Defense Technology
Neuromorphic vision sensors, Event camera, Camera calibration, Pose estimation, Object tracking
Haoyang Li
PhD, University of Technology Sydney
Vision-Language Model, Robotics
Xuanyu Bai
College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China; Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha 410073, China
Taihang Lei
College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China; Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha 410073, China
Bin Li
College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China; Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha 410073, China
Yang Shang
College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China; Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha 410073, China
Qifeng Yu
College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China; Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha 410073, China