SurgCalib: Gaussian Splatting-Based Hand-Eye Calibration for Robot-Assisted Minimally Invasive Surgery

πŸ“… 2026-03-09
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This work addresses inaccurate hand-eye calibration of the da Vinci surgical robot, where cable stretch and backlash corrupt encoder readings and sterility protocols rule out external markers. The authors propose a fully automatic, external-marker-free calibration framework that integrates differentiable Gaussian splatting rendering with Remote Center of Motion (RCM) constraints. The method initializes instrument poses from raw kinematic data and refines them through a two-stage optimization process. Evaluated on the SurgPose dataset, the approach achieves 2D reprojection errors of 12.24 pixels (2.06 mm) and 11.33 pixels (1.90 mm) for the left and right instruments, respectively, along with 3D Euclidean distance errors of 5.98 mm and 4.75 mm. These results demonstrate the method’s high accuracy while preserving sterility during robotic surgery.
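The render-and-compare idea behind the refinement stage can be sketched in miniature: project a tool model under a pose guess, compare with observed 2D keypoints, and adjust the pose to shrink the reprojection error. In the toy below, a pinhole projection and Gauss-Newton stand in for the paper's Gaussian-splatting renderer and two-phase optimizer; the keypoints, focal length, and three-parameter pose model are all invented for illustration.

```python
import numpy as np

# Hypothetical tool keypoints expressed in the robot base frame (metres).
tool_pts = np.array([[0.00, 0.00, 0.40],
                     [0.10, 0.00, 0.50],
                     [0.00, 0.10, 0.45],
                     [0.10, 0.10, 0.55]])

def project(theta, tx, ty, f=500.0):
    """Rotate about z by theta, translate in x/y, then pinhole-project."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    p = tool_pts @ R.T + np.array([tx, ty, 0.0])
    return f * p[:, :2] / p[:, 2:3]            # pixel coordinates

true_params = np.array([0.30, 0.02, -0.01])    # unknown pose correction (toy)
observed = project(*true_params)               # "detected" 2D keypoints

params = np.zeros(3)                           # crude kinematic initialization
for _ in range(30):                            # Gauss-Newton on reprojection error
    r = (project(*params) - observed).ravel()
    # Finite-difference Jacobian; a differentiable renderer supplies this
    # gradient analytically in the actual pipeline.
    J = np.stack([(project(*(params + 1e-6 * e)).ravel()
                   - project(*(params - 1e-6 * e)).ravel()) / 2e-6
                  for e in np.eye(3)], axis=1)
    params -= np.linalg.lstsq(J, r, rcond=None)[0]
```

With exact synthetic observations the pose recovers to machine precision; the paper's contribution is making this comparison dense and differentiable over whole rendered instrument images rather than a handful of keypoints.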

πŸ“ Abstract
We present a Gaussian Splatting-based framework for hand-eye calibration of the da Vinci surgical robot. In a vision-guided robotic system, accurate estimation of the rigid transformation between the robot base and the camera frame is essential for reliable closed-loop control. For cable-driven surgical robots, this task faces unique challenges: the encoders of surgical instruments often produce inaccurate proprioceptive measurements due to cable stretch and backlash. Conventional hand-eye calibration approaches typically rely on known fiducial patterns and solve the AX = XB formulation. While effective, introducing additional markers into the operating room (OR) environment can violate sterility protocols and disrupt surgical workflows. In this study, we propose SurgCalib, an automatic, markerless framework that has the potential to be used in the OR. SurgCalib first initializes the pose of the surgical instrument using raw kinematic measurements and subsequently refines this pose through a two-phase optimization procedure under the Remote Center of Motion (RCM) constraint within a Gaussian Splatting-based differentiable rendering pipeline. We evaluate the proposed method on the public dVRK benchmark, SurgPose. The results demonstrate average 2D tool-tip reprojection errors of 12.24 px (2.06 mm) and 11.33 px (1.90 mm), and 3D tool-tip Euclidean distance errors of 5.98 mm and 4.75 mm, for the left and right instruments, respectively.
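The conventional baseline the abstract contrasts against, solving AX = XB from paired robot and camera motions, has a classic closed-form answer in the style of Park and Martin. The sketch below is not part of this paper; the rotation-log formulation and numpy details are standard textbook material, reproduced here for reference.

```python
import numpy as np

def log_rot(R):
    """Axis-angle vector of a rotation matrix (log map of SO(3))."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-8:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def solve_ax_xb(As, Bs):
    """Closed-form AX = XB (Park-Martin style).

    As, Bs: lists of 4x4 relative motions of the gripper and of the camera
    target. Needs at least three motions whose rotation axes span 3D.
    """
    # Rotation: log(R_A) = R_X log(R_B), so M = sum beta alpha^T and
    # R_X = (M^T M)^(-1/2) M^T, computed via eigendecomposition.
    M = sum(np.outer(log_rot(B[:3, :3]), log_rot(A[:3, :3]))
            for A, B in zip(As, Bs))
    w, V = np.linalg.eigh(M.T @ M)
    Rx = V @ np.diag(w ** -0.5) @ V.T @ M.T
    # Translation from the stacked linear system (R_A - I) t_X = R_X t_B - t_A.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

The catch, as the abstract notes, is that the B motions come from tracking a fiducial pattern in the camera, which is exactly what sterility constraints forbid; SurgCalib's markerless formulation avoids that requirement.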
Problem

Research questions and friction points this paper is trying to address.

hand-eye calibration
robot-assisted surgery
markerless
cable-driven robots
sterility constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gaussian Splatting
hand-eye calibration
markerless
robot-assisted surgery
differentiable rendering
πŸ”Ž Similar Papers
No similar papers found.
Zijian Wu
University of British Columbia
Surgical Robotics · Image Guided Surgery · Robot-assisted Surgery
Shuojue Yang
Department of Biomedical Engineering, National University of Singapore, Singapore
Yu Chung Lee
Robotics and Control Laboratory, University of British Columbia, Vancouver, BC, Canada
Eitan Prisman
Division of Otolaryngology – Head and Neck Surgery, University of British Columbia, Vancouver, BC, Canada
Yueming Jin
Assistant Professor, National University of Singapore
Medical Image Analysis · Surgical AI & Robotics · Multimodal Learning
Septimiu E. Salcudean
Robotics and Control Laboratory, University of British Columbia, Vancouver, BC, Canada