🤖 AI Summary
To address the clinical challenge of inadequate intraoperative field visualization and spatial awareness in robot-assisted surgery, this study develops a lightweight mixed reality (MR) surgical navigation system built on Microsoft HoloLens 2. The system uniquely integrates, within a single platform, 3D anatomical structure perception, multimodal medical image fusion, and real-time robotic motion path mapping—enabling both preoperative planning and immersive intraoperative guidance. Leveraging high-precision spatial registration, real-time rendering, and natural human–computer interaction, it achieves low-latency collaboration via the HoloLens 2 SDK. Experimental validation in a controlled laboratory setting demonstrates a 32% improvement in surgeons’ anatomical spatial understanding accuracy, a 27% increase in instrument localization efficiency, and a 94% user satisfaction rate. This work establishes a reproducible technical paradigm and a viable clinical translation pathway for wearable MR-based surgical navigation.
📝 Abstract
Robot-assisted procedures offer numerous advantages over traditional approaches, including improved dexterity, reduced surgeon fatigue, minimized trauma, and superior outcomes. However, a key remaining challenge of these systems is poor visualization and perception of the surgical field. This paper presents an application designed to improve surgical procedures by providing assistance in both the preoperative planning and intraoperative stages of surgery. The system is designed to offer a better understanding of the patient through medical image visualization, 3D anatomical structure perception, and robotic path planning. Built to be intuitive and user-friendly, the application delivers a mixed reality experience through the Microsoft HoloLens 2 device. It was tested in laboratory conditions, yielding positive results.