LRBO2: Improved 3D Vision Based Hand-Eye Calibration for Collaborative Robot Arm

📅 2025-04-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the inefficiency of conventional collaborative robot hand-eye calibration—namely, its reliance on external calibration targets, multiple robotic motions, and prolonged execution time—this paper proposes a target-free, single-pose point-cloud-based hand-eye calibration method. The approach leverages point-cloud registration to synthesize simulation data and jointly employs Iterative Closest Point (ICP) and deep feature matching to estimate the rigid-body transformation between the camera and robot end-effector. Furthermore, cross-platform kinematic modeling and robust optimization ensure compatibility across 14 robot models from nine major manufacturers, including KUKA, Universal Robots, and Franka Emika. Experimental results demonstrate that the method achieves commercial-grade accuracy while completing calibration in only a few seconds—substantially outperforming traditional approaches in both speed and practicality. The source code is publicly available.
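The summary describes estimating the camera-to-end-effector transform from a single robot pose: registration of the robot base point cloud gives the base's pose in the camera frame, and forward kinematics gives the flange's pose in the base frame. A minimal numpy sketch of how these two poses could be chained (the frame-naming convention `T_cam_base` / `T_base_flange` is an assumption for illustration, not the paper's notation):

```python
import numpy as np

def hand_eye_from_base_registration(T_cam_base, T_base_flange):
    """Compose a flange-to-camera transform from a single robot pose.

    T_cam_base:    4x4 pose of the robot base in the camera frame,
                   e.g. estimated by registering the base point cloud
                   against the scanned data (assumed convention).
    T_base_flange: 4x4 pose of the flange in the base frame, from the
                   robot's forward kinematics.
    Returns T_flange_cam, the camera pose in the flange frame
    (the hand-eye transform for an eye-in-hand setup).
    """
    T_flange_base = np.linalg.inv(T_base_flange)  # invert kinematic pose
    T_base_cam = np.linalg.inv(T_cam_base)        # invert registration result
    return T_flange_base @ T_base_cam             # chain: flange -> base -> camera
```

Because both input poses come from a single robot configuration, no motion sequence or calibration target enters the computation, which is what makes the single-pose formulation fast.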

📝 Abstract
Hand-eye calibration is a common problem in the field of collaborative robotics, involving the determination of the transformation matrix between the visual sensor and the robot flange to enable vision-based robotic tasks. However, this process typically requires multiple movements of the robot arm and an external calibration object, making it both time-consuming and inconvenient, especially in scenarios where frequent recalibration is necessary. In this work, we extend our previous method, Look at Robot Base Once (LRBO), which eliminates the need for external calibration objects such as a chessboard. We propose a generic dataset generation approach for point cloud registration, focusing on aligning the robot base point cloud with the scanned data. Furthermore, a more detailed simulation study is conducted involving several different collaborative robot arms, followed by real-world experiments in an industrial setting. Our improved method is simulated and evaluated using a total of 14 robotic arms from 9 different brands, including KUKA, Universal Robots, UFACTORY, and Franka Emika, all of which are widely used in the field of collaborative robotics. Physical experiments demonstrate that our extended approach achieves performance comparable to existing commercial hand-eye calibration solutions, while completing the entire calibration procedure in just a few seconds. In addition, we provide a user-friendly hand-eye calibration solution, with the code publicly available at github.com/leihui6/LRBO2.
Problem

Research questions and friction points this paper is trying to address.

Reliance on external calibration objects such as a chessboard
Time-consuming, multi-motion calibration procedures, inconvenient where frequent recalibration is needed
Limited validation of calibration methods across different collaborative robot brands
Innovation

Methods, ideas, or system contributions that make the work stand out.

Eliminates need for external calibration objects
Uses point cloud registration for alignment
Achieves fast calibration in seconds
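The innovation list credits point cloud registration, and the summary names Iterative Closest Point (ICP) as one component. A self-contained sketch of textbook point-to-point ICP (brute-force nearest neighbours plus an SVD/Kabsch rigid update) shows the core idea; it is a generic illustration, not the paper's implementation, which additionally uses deep feature matching:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """SVD (Kabsch) solution for the rigid transform mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)           # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=50):
    """Point-to-point ICP; returns (R, t) with dst ~= src @ R.T + t."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # nearest neighbour in dst for every current point (O(N*M); fine for a sketch)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t                       # apply the incremental update
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

In the paper's setting, `src` would be a model point cloud of the robot base (hence the proposed dataset generation for registration) and `dst` the scanned scene; real use would add feature-based coarse alignment before this fine-registration stage.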
Leihui Li
Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark
Lixuepiao Wan
Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark
Volker Krueger
Lund University
Robotics, computer vision, machine intelligence, image processing
Xuping Zhang
Department of Mechanical and Production Engineering, Aarhus University, Aarhus, Denmark