EasyREG: Easy Depth-Based Markerless Registration and Tracking using Augmented Reality Device for Surgical Guidance

📅 2025-04-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenges of markerless, high-precision, real-time anatomical registration and tracking in AR-assisted surgery, this paper proposes a dual-module framework that relies solely on the depth sensor of an AR device. The registration module combines depth-sensor error correction, human-in-the-loop region filtering, curvature-aware feature sampling, and robust global alignment followed by local iterative closest point (ICP) refinement; the tracking module performs fast, outlier-resilient pose estimation starting from the pose produced by the registration module. The method eliminates reliance on external markers and RGB data, thereby avoiding registration failures caused by occlusion and cross-modal discrepancies. Evaluated in both simulation and real-world measurements, it achieves higher registration accuracy than state-of-the-art industrial solutions while matching their tracking stability and real-time performance for both moving and static anatomical targets, establishing a clinically deployable paradigm for markerless AR navigation.

📝 Abstract
The use of Augmented Reality (AR) devices for surgical guidance has gained increasing traction in the medical field. Traditional registration methods often rely on external fiducial markers to achieve high accuracy and real-time performance. However, these markers introduce cumbersome calibration procedures and can be challenging to deploy in clinical settings. While commercial solutions have attempted real-time markerless tracking using the native RGB cameras of AR devices, their accuracy remains questionable for medical guidance, primarily due to occlusions and significant outliers between the live sensor data and the preoperative target anatomy point cloud derived from MRI or CT scans. In this work, we present a markerless framework that relies only on the depth sensor of AR devices and consists of two modules: a registration module for high-precision, outlier-robust target anatomy localization, and a tracking module for real-time pose estimation. The registration module integrates depth sensor error correction, a human-in-the-loop region filtering technique, and a robust global alignment with curvature-aware feature sampling, followed by local ICP refinement, for markerless alignment of preoperative models with patient anatomy. The tracking module employs a fast and robust registration algorithm that uses the initial pose from the registration module to estimate the target pose in real-time. We comprehensively evaluated the performance of both modules through simulation and real-world measurements. The results indicate that our markerless system achieves superior performance for registration and comparable performance for tracking to industrial solutions. The two-module design makes our system a one-stop solution for surgical procedures where the target anatomy moves or stays static during surgery.
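The paper does not include code, but the curvature-aware feature sampling step in the registration module can be illustrated with a minimal numpy sketch. It scores each point by the surface-variation ratio of its local covariance eigenvalues (a common curvature proxy; the authors' actual sampling criterion may differ) and keeps the highest-curvature points, which carry the most geometric information for alignment:

```python
import numpy as np

def curvature_scores(points, k=10):
    """Approximate per-point curvature as the surface-variation ratio
    lambda_0 / (lambda_0 + lambda_1 + lambda_2) of the eigenvalues of
    the local covariance over the k nearest neighbours (illustrative
    proxy; not necessarily the paper's exact measure)."""
    n = len(points)
    # Brute-force kNN; fine for a sketch, use a KD-tree for real clouds.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, :k]
    scores = np.empty(n)
    for i in range(n):
        nb = points[idx[i]]
        cov = np.cov(nb.T)
        ev = np.sort(np.linalg.eigvalsh(cov))   # ascending eigenvalues
        scores[i] = ev[0] / max(ev.sum(), 1e-12)
    return scores

def curvature_aware_sample(points, m, k=10):
    """Keep the m points with the highest curvature scores."""
    s = curvature_scores(points, k)
    return points[np.argsort(-s)[:m]]
```

On a cloud that is mostly flat (near-zero smallest eigenvalue), this concentrates the sample on ridges and creases, which is what makes a sparse feature set useful for the subsequent robust global alignment.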
Problem

Research questions and friction points this paper is trying to address.

Markerless surgical guidance using AR depth sensors
Robust registration and tracking without fiducial markers
Real-time pose estimation for moving or static anatomy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Depth sensor-based markerless registration for surgical guidance
Human-in-the-loop region filtering for outlier robustness
Real-time tracking with curvature-aware feature sampling
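To make the tracking idea concrete, here is a hedged numpy sketch of an outlier-trimmed ICP seeded with an initial pose, in the spirit of the tracking module described above. The trimming fraction, brute-force correspondence search, and Kabsch-style minimizer are illustrative stand-ins, not the authors' implementation:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform src -> dst (Kabsch, no scale)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def trimmed_icp(src, dst, R0, t0, iters=30, keep=0.8):
    """Trimmed ICP: each iteration matches every source point to its
    nearest target point, keeps only the `keep` fraction with the
    smallest residuals (outlier rejection), and re-fits the rigid
    transform. R0, t0 is the initial pose, e.g. from registration."""
    R, t = R0.copy(), np.asarray(t0, float).copy()
    m = int(keep * len(src))
    for _ in range(iters):
        moved = src @ R.T + t
        d2 = ((moved[:, None] - dst[None]) ** 2).sum(-1)
        nn = d2.argmin(1)
        res = d2[np.arange(len(src)), nn]
        inl = np.argsort(res)[:m]               # trim worst matches
        R, t = rigid_fit(src[inl], dst[nn[inl]])
    return R, t
```

Seeding from a good initial pose is what lets a simple local scheme like this run in real time: the nearest-neighbour correspondences are mostly correct from the first iteration, and the trimming step absorbs depth-sensor outliers.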