Learning From a Steady Hand: A Weakly Supervised Agent for Robot Assistance under Microscopy

📅 2026-01-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the reliance on manual annotation and complex calibration in robot-assisted microsurgery under microscopic guidance by proposing a weakly supervised framework that integrates calibration-aware vision with admittance control. The approach leverages reusable warm-up trajectories to implicitly extract spatial information, enabling depth-resolved perception without external markers or manual depth annotations. A newly introduced task-space error budgeting mechanism, combined with residual modeling and uncertainty quantification, achieves high-precision manipulation without explicit labeling. Experimental results demonstrate a lateral closed-loop accuracy of 49 µm (95% confidence) and a depth accuracy of ≤ 291 µm, while reducing user workload by 77.1%, significantly enhancing system reliability and practicality.

📝 Abstract
This paper rethinks steady-hand robotic manipulation with a weakly supervised framework that fuses calibration-aware perception with admittance control. Unlike conventional automation that relies on labor-intensive 2D labeling, our framework leverages reusable warm-up trajectories to extract implicit spatial information, achieving calibration-aware, depth-resolved perception without external fiducials or manual depth annotation. By explicitly characterizing residuals from the observation and calibration models, the system establishes a task-space error budget from recorded warm-ups. The uncertainty budget yields a lateral closed-loop accuracy of approximately 49 micrometers at 95% confidence (worst-case testing subset) and a depth accuracy of ≤ 291 micrometers (95% confidence bound) during large in-plane moves. In a within-subject user study (N=8), the learned agent reduces overall NASA-TLX workload by 77.1% relative to a simple steady-hand assistance baseline. These results demonstrate that the weakly supervised agent improves the reliability of microscope-guided biomedical micromanipulation without introducing complex setup requirements, offering a practical framework for microscope-guided intervention.
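The abstract's task-space error budget can be pictured as an empirical confidence bound computed from residuals recorded during warm-up trajectories. The minimal sketch below is an illustrative assumption, not the paper's formulation: the function name, the percentile-based bound, and the synthetic residual data are all hypothetical.

```python
import random

def error_budget_95(residuals_um):
    """Empirical 95% task-space error bound (micrometers).

    `residuals_um` holds signed residuals (observation + calibration
    model error) collected over recorded warm-up trajectories; the
    bound is the 95th percentile of their magnitudes. This is an
    illustrative sketch, not the paper's exact method.
    """
    magnitudes = sorted(abs(r) for r in residuals_um)
    idx = min(len(magnitudes) - 1, int(0.95 * len(magnitudes)))
    return magnitudes[idx]

# Synthetic residuals for illustration only (no relation to real data).
rng = random.Random(0)
lateral_residuals = [rng.gauss(0.0, 25.0) for _ in range(2000)]
depth_residuals = [rng.gauss(0.0, 150.0) for _ in range(2000)]

lat_bound = error_budget_95(lateral_residuals)    # lateral 95% bound, µm
depth_bound = error_budget_95(depth_residuals)    # depth 95% bound, µm
```

Bounds of this form let a controller state, before a move, the worst-case task-space error at a chosen confidence level, which is how the reported 49 µm lateral and ≤ 291 µm depth figures should be read.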
Problem

Research questions and friction points this paper is trying to address.

steady-hand robotics
weakly supervised learning
microscopy-guided micromanipulation
calibration-aware perception
robot assistance
Innovation

Methods, ideas, or system contributions that make the work stand out.

weakly supervised learning
calibration-aware perception
admittance control
microscopic micromanipulation
error budgeting
Huanyu Tian
Postdoctoral Research Associate at King's College London
Robotics
Shared autonomy
Surgical navigation
Martin Huber
School of Biomedical Engineering & Imaging Sciences, King’s College London, UK
Lingyun Zeng
School of Biomedical Engineering & Imaging Sciences, King’s College London, UK
Zhe Han
King's College London
Medical Imaging
Wayne Bennett
Conceivable Life Sciences, New York City, US and London, UK
Giuseppe Silvestri
Conceivable Life Sciences, New York City, US and London, UK
Gerardo Mendizabal-Ruiz
Conceivable Life Sciences, New York City, US and London, UK
Tom Vercauteren
Professor of Interventional Image Computing, King's College London
Medical Image Computing
Image Registration
Computer-assisted Interventions
Endomicroscopy
Image-guided Interventions
Alejandro Chavez-Badiola
Hope IVF Mexico, Mexico
Christos Bergeles
School of Biomedical Engineering & Imaging Sciences, King’s College London, UK