🤖 AI Summary
To address two challenges in endoscopic surgical training—limited mentor supervision and the difficulty of mapping 2D endoscopic views onto 3D anatomy—this study conducted a user-centered co-design process with 18 surgical trainees, producing the design of an augmented reality (AR) eye-gaze tracking system. The envisioned system projects the attending surgeon's gaze fixation point onto the surgical field view, supporting anatomical localization practice and attentional synchronization between trainee and expert. The contribution is a practitioner-driven, operating-room–oriented co-design process for AR eye-gaze tracking in surgical education, yielding user-informed design guidelines and co-designed system features, including real-time interactivity and improved visibility of the attending surgeon's gaze. Trainees judged the proposed system a useful supplemental training method that could enhance learning during OR cases without detracting from patient care, offering a scalable path toward effective surgical training under low-supervision conditions.
📝 Abstract
The current apprenticeship model for surgical training requires a high level of supervision, which does not scale to meet the growing need for more surgeons. Many endoscopic procedures are taught directly in the operating room (OR) while the attending surgeon and trainee operate on patients. The need to prioritize patient care limits trainees' opportunities to experiment and receive feedback on their performance. Augmented reality (AR) has the potential to increase the efficiency of endoscopic surgical training, but additional research is critical to understanding the needs of surgical trainees and informing the design of AR training systems. We therefore worked with 18 surgical trainees to understand the strengths, limitations, and unmet needs of their current training environment and to co-design an AR eye-gaze tracking system based on their preferences. Trainees emphasized the need to practice the 2D-to-3D mapping required to familiarize themselves with patient anatomy in preparation for real surgery. They felt that an AR-based eye-gaze tracking system would be a useful supplemental training method that would improve their learning in OR cases without detracting from patient care. To tailor the AR system to their needs, they co-designed features to improve their ability to track the attending surgeon's eye gaze and to provide a real-time, interactive system. Our results help shape endoscopic training modules by providing user-informed guidelines for designing future collaborative AR-based eye-gaze tracking systems.