🤖 AI Summary
To address the challenge of scan path planning in intercostal ultrasound imaging caused by limited acoustic windows, this paper proposes a haptic-guided robotic ultrasound scanning path planning method. The approach innovatively employs 1D haptic signals to directly sense subcutaneous rib structures—bypassing conventional ultrasound image segmentation—and enables robust extraction of bone surface point clouds. A sparse haptic point cloud is constructed by integrating robot trajectory data, and interpolation and registration strategies are introduced for subject-specific path mapping. An automatic tilt adjustment mechanism ensures complete coverage of target structures beneath the ribs. The system integrates haptic sensing, trajectory tracking, point cloud processing, and CT-based validation. Experiments on four phantom models yield path mapping errors of 3.41 mm (mean nearest-neighbor distance) and 3.65 mm (Hausdorff distance), and sub-rib structure reconstruction errors of 0.69 mm (mean) and 2.2 mm (maximum) against the CT ground truth.
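The two error metrics quoted above are standard point-cloud distances and can be reproduced in a few lines. Below is a minimal sketch using NumPy/SciPy; the function and array names are illustrative and not taken from the paper's code:

```python
import numpy as np
from scipy.spatial import cKDTree

def mnnd(a, b):
    """Mean Nearest-Neighbor Distance: average distance from each
    point of cloud `a` to its closest point in cloud `b`."""
    dists, _ = cKDTree(b).query(a)
    return dists.mean()

def hausdorff(a, b):
    """Symmetric Hausdorff distance: the worst-case nearest-neighbor
    distance in either direction between clouds `a` and `b`."""
    d_ab, _ = cKDTree(b).query(a)
    d_ba, _ = cKDTree(a).query(b)
    return max(d_ab.max(), d_ba.max())

# Toy example: two 3D clouds offset along x.
a = np.array([[0.0, 0, 0], [1.0, 0, 0]])
b = np.array([[0.0, 0, 0], [2.0, 0, 0]])
print(mnnd(a, b))       # 0.5  (distances 0 and 1, averaged)
print(hausdorff(a, b))  # 1.0
```

MNND averages the per-point error (and is therefore forgiving of single outliers), while the Hausdorff distance reports the single worst mismatch, which is why both are quoted together.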
📝 Abstract
Medical ultrasound (US) imaging is widely used in clinical examinations due to its portability, real-time capability, and radiation-free nature. To address inter- and intra-operator variability, robotic ultrasound systems have gained increasing attention. However, their application to the challenging task of intercostal imaging remains limited by the lack of an effective method for generating scan paths within the constrained acoustic window. To overcome this challenge, we explore the potential of tactile cues to characterize subcutaneous rib structures, providing an alternative signal for extracting bone surface point clouds without ultrasound image segmentation. Compared with 2D US images, 1D tactile-related signals offer higher processing efficiency and are less susceptible to acoustic noise and artifacts. By leveraging robotic tracking data, a sparse tactile point cloud is generated from a few scans along the rib, mimicking human palpation. To robustly map the scanning trajectory into the intercostal space, the sparse tactile point cloud of bone locations is first interpolated into a denser representation. This refined point cloud is then registered to an image-based dense bone surface point cloud, enabling accurate scan path mapping for individual patients. Additionally, to ensure full coverage of the object of interest, we introduce an automated tilt angle adjustment method to visualize structures beneath the bone. To validate the proposed method, we conducted comprehensive experiments on four distinct phantoms. The final scanning waypoint mapping achieved Mean Nearest Neighbor Distance (MNND) and Hausdorff distance (HD) errors of 3.41 mm and 3.65 mm, respectively, while the reconstructed object beneath the bone showed errors of 0.69 mm and 2.2 mm compared to the CT ground truth.
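The interpolate-then-register step described in the abstract can be sketched with standard tools: arc-length interpolation densifies the sparse tactile bone points, and a point-to-point ICP with the Kabsch/SVD alignment solves the rigid registration to the image-based dense cloud. This is a common pipeline for such problems, not necessarily the paper's exact algorithm, and all names and parameters below are illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def densify(polyline, n=200):
    """Upsample an ordered polyline of sparse tactile bone points
    by linear interpolation in arc length."""
    seg = np.linalg.norm(np.diff(polyline, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # cumulative arc length
    u = np.linspace(0.0, s[-1], n)
    return np.column_stack([np.interp(u, s, polyline[:, k])
                            for k in range(polyline.shape[1])])

def icp(src, dst, iters=50):
    """Point-to-point ICP: rigidly align cloud `src` to cloud `dst`.
    Returns rotation R and translation t such that dst ~ src @ R.T + t."""
    tree = cKDTree(dst)
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        _, idx = tree.query(cur)              # nearest-neighbor correspondences
        q = dst[idx]
        mu_p, mu_q = cur.mean(0), q.mean(0)
        H = (cur - mu_p).T @ (q - mu_q)       # cross-covariance of matched sets
        U, _, Vt = np.linalg.svd(H)           # Kabsch: best rigid rotation
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        Ri = Vt.T @ D @ U.T
        ti = mu_q - Ri @ mu_p
        cur = cur @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti            # accumulate the total transform
    return R, t
```

Once `R` and `t` are recovered, the same rigid transform can carry the planned intercostal waypoints into the individual patient's frame, which is the scan path mapping whose MNND/HD errors the abstract reports.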