🤖 AI Summary
Conventional freehand ultrasound is highly operator-dependent, resulting in inconsistent image quality, poor reproducibility, and operator fatigue. Method: We propose a novel robotic ultrasound system featuring a custom active-sensing end-effector that fuses data from dual RGB-D cameras for real-time, dense contact-surface reconstruction and online surface normal estimation, without requiring preoperative anatomical modeling. Closed-loop pose control and a passively compliant mechanical design dynamically maintain transducer perpendicularity to the skin. Contribution/Results: Experiments yield surface normal estimation errors of 2.47±1.25° on planar surfaces and 12.19±5.81° on anthropomorphic curved phantoms; acquired ultrasound images match manual scanning quality. Critically, we demonstrate in vivo clinical feasibility for the first time on human forearms. This work establishes a standardized, automated ultrasound imaging paradigm suitable for resource-constrained settings.
📝 Abstract
Conventional freehand ultrasound (US) imaging is highly dependent on the skill of the operator, often leading to inconsistent results and increased physical demand on sonographers. Robotic Ultrasound Systems (RUSS) aim to address these limitations by providing standardized and automated imaging solutions, especially in environments with limited access to skilled operators. This paper presents the development of a novel RUSS that employs dual RGB-D depth cameras to keep the US probe normal to the skin surface, a critical factor for optimal image quality. Our RUSS integrates RGB-D camera data with robotic control algorithms to maintain orthogonal probe alignment on uneven surfaces without preoperative data. Validation tests using a phantom model demonstrate that the system achieves robust normal-positioning accuracy while delivering ultrasound images comparable to those obtained through manual scanning. A-SEE2.0 achieves a normal-positioning error of 2.47 ± 1.25 degrees on flat surfaces and a normal estimation error of 12.19 ± 5.81 degrees on a mannequin surface. This work highlights the potential of A-SEE2.0 for clinical practice by testing its performance during in vivo forearm ultrasound examinations.
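The paper does not include its estimation code, but the core idea of recovering a surface normal from RGB-D depth data can be sketched with a standard PCA-on-a-local-patch approach: fit the covariance of a small neighborhood of 3D points and take the eigenvector with the smallest eigenvalue as the normal. This is a generic, minimal illustration, not the authors' implementation; the patch geometry, noise level, and camera-facing orientation convention below are all assumptions.

```python
import numpy as np

def estimate_surface_normal(points):
    """Estimate the unit normal of a local 3D point patch via PCA.

    points: (N, 3) array of points, e.g. sampled from a depth camera.
    The normal is the eigenvector of the patch covariance with the
    smallest eigenvalue (direction of least variance).
    """
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]
    # Orient the normal toward the camera (assumed to look along -Z).
    if normal[2] < 0:
        normal = -normal
    return normal

def angular_error_deg(n_est, n_true):
    """Angle in degrees between two unit normals (sign-insensitive)."""
    c = np.clip(abs(np.dot(n_est, n_true)), 0.0, 1.0)
    return np.degrees(np.arccos(c))

# Synthetic planar patch z = 0.1*x + 0.2*y with mild depth noise,
# standing in for a small skin-surface neighborhood seen by an RGB-D camera.
rng = np.random.default_rng(0)
xy = rng.uniform(-0.02, 0.02, size=(500, 2))  # ~4 cm patch
z = 0.1 * xy[:, 0] + 0.2 * xy[:, 1] + rng.normal(0.0, 1e-4, 500)
patch = np.column_stack([xy, z])

true_normal = np.array([-0.1, -0.2, 1.0])
true_normal /= np.linalg.norm(true_normal)
err = angular_error_deg(estimate_surface_normal(patch), true_normal)
print(f"normal estimation error: {err:.2f} deg")
```

On a clean synthetic plane this lands well under the 2.47° flat-surface error reported above; real curved anatomy, sensor noise, and camera fusion are what make the problem, and the reported errors, harder.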