Towards Motion Compensation in Autonomous Robotic Subretinal Injections

📅 2024-11-27
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
In subretinal injection for wet age-related macular degeneration, physiological Z-axis micro-movements of the retina, induced by respiration and cardiac pulsation, significantly compromise targeting accuracy. To address this, the authors propose a real-time, closed-loop robotic motion compensation system leveraging optical coherence tomography (OCT). The method employs high-speed, small-volume B⁵-scan OCT acquisitions for real-time retinal Z-axis tracking within a prediction–feedback compensation framework. Ex vivo porcine eye experiments, however, reveal that accurate compensation remains challenging: tool-to-retina distance deviated by up to 200 µm for 100 µm amplitude motions and by over 80 µm for 25 µm amplitude motions, and horizontal shifts caused the injection needle to move off-target into the vitreous. The study highlights the critical role of horizontal stability and motion prediction in achieving precise subretinal injection and provides empirical groundwork toward clinical-grade robotic ophthalmic surgery.
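The prediction–feedback loop described above can be sketched in simplified form. This is an illustrative toy model only, not the paper's implementation: the functions `predict_next_z` and `compensate`, the proportional gain, the sampling rate, and the sinusoidal motion profile are all assumptions made for the sketch.

```python
import math

def predict_next_z(history, dt):
    """Constant-velocity prediction of retinal depth from the last two samples."""
    if len(history) < 2:
        return history[-1]
    velocity = (history[-1] - history[-2]) / dt
    return history[-1] + velocity * dt

def compensate(z_measurements, target_offset, dt=0.01, gain=0.8):
    """Track retinal depth and command tool motion so that the tool holds a
    fixed tool-to-retina distance. Returns the commanded tool Z positions."""
    history = []
    tool_z = z_measurements[0] + target_offset
    commands = []
    for z in z_measurements:
        history.append(z)
        z_pred = predict_next_z(history, dt)       # where the retina will be next
        error = (z_pred + target_offset) - tool_z  # deviation from the setpoint
        tool_z += gain * error                     # proportional correction
        commands.append(tool_z)
    return commands

# Hypothetical example: 50 µm amplitude sinusoidal retinal motion at 1.2 Hz
# (heartbeat-like), sampled at a 100 Hz OCT volume rate for 6 seconds.
dt = 0.01
retina = [50e-6 * math.sin(2 * math.pi * 1.2 * i * dt) for i in range(600)]
cmds = compensate(retina, target_offset=100e-6, dt=dt)
residual = max(abs(c - (z + 100e-6)) for c, z in zip(cmds, retina))
print(f"max residual tracking error: {residual * 1e6:.1f} um")
```

In this toy setting the residual stays well below the uncompensated motion amplitude; the paper's ex vivo results show that real tissue, needle-tissue interaction, and horizontal drift make the problem considerably harder than this idealized loop suggests.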

📝 Abstract
Exudative (wet) age-related macular degeneration (AMD) is a leading cause of vision loss in older adults, typically treated with intravitreal injections. Emerging therapies, such as subretinal injections of stem cells, gene therapy, small molecules or RPE cells, require precise delivery to avoid damaging delicate retinal structures. Autonomous robotic systems can potentially offer the necessary precision for these procedures. This paper presents a novel approach for motion compensation in robotic subretinal injections, utilizing real-time Optical Coherence Tomography (OCT). The proposed method leverages B⁵-scans, a rapid acquisition of small-volume OCT data, for dynamic tracking of retinal motion along the Z-axis, compensating for physiological movements such as breathing and heartbeat. Validation experiments on ex vivo porcine eyes revealed challenges in maintaining a consistent tool-to-retina distance, with deviations of up to 200 µm for 100 µm amplitude motions and over 80 µm for 25 µm amplitude motions over one minute. Subretinal injections faced additional difficulties, with horizontal shifts causing the needle to move off-target and inject into the vitreous. These results highlight the need for improved motion prediction and horizontal stability to enhance the accuracy and safety of robotic subretinal procedures.
Problem

Research questions and friction points this paper is trying to address.

Develops motion compensation for robotic subretinal injections.
Addresses challenges in precise delivery to avoid retinal damage.
Improves accuracy and safety using real-time OCT tracking.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Real-time OCT for retinal motion tracking
B⁵-scans for dynamic Z-axis compensation
Enhanced precision in robotic subretinal injections
Demir Arikan
Department of Computer Science, Technische Universität München, Munich 85748 Germany; Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA
Peiyao Zhang
Johns Hopkins University
retinal microsurgery, robotics
Michael Sommersperger
Technische Universität München
Medical Applications, Computer Graphics, Deep Learning, Mixed Reality
Shervin Dehghani
Department of Computer Science, Technische Universität München, Munich 85748 Germany
Mojtaba Esfandiari
Johns Hopkins University
Medical Robotics, Continuum Manipulators, Automated Surgery, Control Theory
Russell H. Taylor
John C. Malone Professor of Computer Science, Johns Hopkins University
Robotics, Medical Robotics, Computer-Integrated Surgery, Computer-Assisted Surgery
M. A. Nasseri
Department of Computer Science, Technische Universität München, Munich 85748 Germany; Augenklinik und Poliklinik, Klinikum rechts der Isar der Technische Universität München, München 81675 Germany
Peter L. Gehlbach
Wilmer Eye Institute, Johns Hopkins Hospital, Baltimore, MD, USA
Nassir Navab
Professor of Computer Science, Technische Universität München
I. Iordachita
Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA