🤖 AI Summary
This work proposes MARVUS, a system that integrates mobile augmented reality (AR) with a foundation model to enable accurate and scalable 3D reconstruction of breast and thyroid lesions using only conventional 2D ultrasound equipment. It addresses the high inter-user variability inherent in manual 2D volume assessments and the cost and portability limitations of existing 3D ultrasound solutions, which typically require specialized probes or external tracking hardware; MARVUS operates without any additional hardware. In a phantom study on breast lesions, the system improved volume estimation accuracy by a mean difference of 0.469 cm³ and reduced inter-user variability by a mean difference of 0.417 cm³, demonstrating gains in reconstruction accuracy, cross-specialty generalizability, and clinical practicality.
📝 Abstract
Accurate volumetric characterization of lesions is essential for oncologic diagnosis, risk stratification, and treatment planning. While imaging modalities such as computed tomography provide high-quality 3D data, 2D ultrasound (2D-US) remains the preferred first-line modality for breast and thyroid imaging due to cost, portability, and safety factors. However, volume estimates derived from 2D-US suffer from high inter-user variability even among experienced clinicians. Existing 3D ultrasound (3D-US) solutions use specialized probes or external tracking hardware, but such configurations increase costs and diminish portability, constraining widespread clinical use. To address these limitations, we present Mobile Augmented Reality Volumetric Ultrasound (MARVUS), a resource-efficient system designed to increase access to accurate and reproducible volumetric assessment. MARVUS is interoperable with conventional ultrasound (US) systems, using a foundation model to enhance cross-specialty generalization while minimizing hardware requirements relative to current 3D-US solutions. In a user study involving experienced clinicians performing measurements on breast phantoms, MARVUS yielded a substantial improvement in volume estimation accuracy (mean difference: 0.469 cm³) with reduced inter-user variability (mean difference: 0.417 cm³). Additionally, we show that augmented reality (AR) visualizations enhance both objective performance metrics and clinician-reported usability. Collectively, our findings suggest that MARVUS can enhance US-based cancer screening, diagnostic workflows, and treatment planning in a scalable, cost-conscious, and resource-efficient manner. A video demonstration of system usage is available (https://youtu.be/m4llYcZpqmM).