🤖 AI Summary
This study addresses key limitations of conventional 2D electronic health record (EHR) interfaces, namely poor interactivity, lack of spatial context, and inefficient clinical collaboration. To address these, we propose an immersive EHR visualization framework built on extended reality (XR). Our modular architecture integrates FHIR-standardized structured EHR data, CT volumetric datasets, and AI-generated medical image segmentation outputs, enabling spatial mapping, real-time rendering, and natural interaction with multimodal patient data in a shared 3D environment. The platform ensures cross-system interoperability, and its feasibility is demonstrated with synthetic EHR data and AI-reconstructed 3D spinal models. To our knowledge, this is the first work to deeply integrate FHIR, AI-driven segmentation, and XR, establishing a novel paradigm for collaborative, explorable, and extensible clinical data interaction. The framework lays a critical technical foundation for next-generation immersive clinical decision support systems.
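As a rough illustration of the FHIR-standardized structured data the summary refers to, the sketch below builds a minimal FHIR R4 Observation resource and flattens it into the kind of label an XR front end might bind to a 3D anatomical annotation. The resource contents and the `to_xr_label` helper are illustrative assumptions, not the paper's actual data model.

```python
import json

# Minimal FHIR R4 Observation resource (synthetic values, in the spirit
# of the synthetic EHR datasets used in the study's demonstration).
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org",
                         "code": "8867-4",
                         "display": "Heart rate"}]},
    "subject": {"reference": "Patient/synthetic-001"},
    "valueQuantity": {"value": 72, "unit": "beats/minute"},
}

def to_xr_label(obs: dict) -> str:
    """Flatten a FHIR Observation into a short label string that an
    XR scene could attach to the corresponding 3D model."""
    coding = obs["code"]["coding"][0]
    qty = obs["valueQuantity"]
    return f'{coding["display"]}: {qty["value"]} {qty["unit"]}'

print(to_xr_label(observation))  # Heart rate: 72 beats/minute
```

Because FHIR resources are plain JSON, the same mapping step could consume output from any FHIR-compliant server, which is the interoperability property the framework relies on.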
📝 Abstract
This paper presents the design and implementation of an Extended Reality (XR) platform for immersive, interactive visualization of Electronic Health Records (EHRs). The system moves beyond conventional 2D interfaces by projecting both structured and unstructured patient data into a shared 3D environment, enabling intuitive exploration and real-time collaboration. Its modular infrastructure integrates FHIR-based EHR data with volumetric medical imaging and AI-generated segmentation, ensuring interoperability with modern healthcare systems. The platform's capabilities are demonstrated using synthetic EHR datasets and computed tomography (CT)-derived spine models processed through an AI-powered segmentation pipeline. This work suggests that such integrated XR solutions could form the foundation for next-generation clinical decision-support tools, in which advanced data infrastructures become directly accessible in an interactive, spatially rich environment.