🤖 AI Summary
Traditional head anatomy education relies on textbooks, physical models, and cadaveric dissection—methods that present spatial relationships abstractly, offer limited interactivity, and provide no way to register virtual content to the physical body. To address these limitations, this study proposes a mixed reality (MR)-based immersive anatomy teaching system. The system employs hierarchical information visualization and an automated spatial calibration module to support progressive learning, from macroscopic anatomical labeling to fine-grained structural exploration. By integrating real-time pose estimation, hand gesture recognition, and MR see-through rendering, it achieves millimeter-accurate spatial alignment between virtual 3D anatomical models and the user's actual head. Experimental evaluation demonstrates clear improvements in spatial comprehension accuracy and interactive immersion compared with conventional methods. This work establishes a high-fidelity, scalable, and highly interactive MR pedagogical paradigm for medical education.
📝 Abstract
Extended reality (XR), encompassing virtual reality (VR), augmented reality (AR), and mixed reality (MR), is emerging as a transformative platform for medical education. Traditional methods such as textbooks, physical models, and cadaveric dissection often lack interactivity and fail to convey complex spatial relationships effectively. MR addresses these limitations by providing immersive environments that blend virtual elements with real-world contexts. This study presents an MR application for head anatomy education that enables learners to interact intuitively with see-through 3D anatomical structures via hand gestures and controllers. Our hierarchical information design supports progressive learning, guiding users from basic anatomical labels to detailed structural insights. Additionally, the system incorporates an automatic calibration module that aligns virtual anatomical models with a real human head, thereby enabling realistic human-model interaction. Experiments show that the system accurately matches the anatomical model to the live scene, enhancing the interactivity and immersion of medical education and providing an innovative tool for teaching anatomy.
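The abstract does not specify how the calibration module computes the virtual-to-physical alignment. As an illustration only (not the authors' implementation), a standard approach to this kind of rigid registration is the Kabsch algorithm: given a few corresponding landmarks on the virtual model and on the tracked real head, it recovers the least-squares rotation and translation mapping one onto the other. The landmark values below are synthetic.

```python
import numpy as np

def rigid_align(model_pts, head_pts):
    """Kabsch/Procrustes: find rotation R and translation t such that
    R @ model_pt + t best matches head_pt in the least-squares sense.
    model_pts, head_pts: (N, 3) arrays of corresponding 3D landmarks."""
    mu_m = model_pts.mean(axis=0)          # centroid of virtual-model landmarks
    mu_h = head_pts.mean(axis=0)           # centroid of tracked head landmarks
    P = model_pts - mu_m                   # center both point sets
    Q = head_pts - mu_h
    U, _, Vt = np.linalg.svd(P.T @ Q)      # SVD of the 3x3 cross-covariance
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # sign correction to avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_h - R @ mu_m
    return R, t

# Synthetic check: apply a known pose to fake landmarks, then recover it.
rng = np.random.default_rng(0)
model = rng.normal(size=(6, 3))            # 6 hypothetical model landmarks (m)
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.05, -0.02, 0.10])
head = model @ R_true.T + t_true           # simulated tracked head landmarks
R, t = rigid_align(model, head)
residual = np.abs(head - (model @ R.T + t)).max()
```

On clean correspondences the residual is numerically zero; in practice, alignment accuracy is limited by landmark detection noise from the headset's pose-estimation pipeline.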