🤖 AI Summary
In ultrasound-guided needle insertion, conventional needle detection lacks robustness because of speckle noise, needle-like artifacts, and low needle visibility. To address this, we propose a vibration-enhanced, end-to-end needle detection framework. Our approach introduces periodic micro-vibration into the ultrasound intervention workflow to elicit needle-specific motion cues, and it pioneers the integration of a differentiable neural short-time Fourier transform (STFT) with the Hough transform to extract discriminative motion features directly in the frequency domain, thereby circumventing intensity-domain interference. The network jointly models the vibration signal and learns spatiotemporal features. Evaluated on ex vivo porcine tissue, it achieves a needle tip localization error of 1.61 ± 1.56 mm (an ~80% reduction versus U-Net) and an orientation error of 1.64 ± 1.86°, demonstrating substantial improvements in detection accuracy and reliability under low-visibility conditions.
📝 Abstract
Precise percutaneous needle detection is crucial for ultrasound (US)-guided interventions. However, inherent limitations such as speckles, needle-like artifacts, and low resolution make it challenging to robustly detect needles, especially when their visibility is reduced or imperceptible. To address this challenge, we propose VibNet, a learning-based framework designed to enhance the robustness and accuracy of needle detection in US images by leveraging periodic vibration applied externally to the needle shafts. VibNet integrates neural Short-Time Fourier Transform and Hough Transform modules to achieve successive sub-goals, including motion feature extraction in the spatiotemporal space, frequency feature aggregation, and needle detection in the Hough space. Due to the periodic subtle vibration, the features are more robust in the frequency domain than in the image intensity domain, making VibNet more effective than traditional intensity-based methods. To demonstrate the effectiveness of VibNet, we conducted experiments on distinct *ex vivo* porcine and bovine tissue samples. The results obtained on porcine samples demonstrate that VibNet effectively detects needles even when their visibility is severely reduced, with a tip error of $1.61 \pm 1.56$ mm compared to $8.15 \pm 9.98$ mm for UNet and $6.63 \pm 7.58$ mm for WNet, and a needle direction error of $1.64 \pm 1.86^{\circ}$ compared to $9.29 \pm 15.30^{\circ}$ for UNet and $8.54 \pm 17.92^{\circ}$ for WNet. Code: https://github.com/marslicy/VibNet.
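The core idea, extracting vibration-frequency energy per pixel with an STFT and then locating the needle as a line in Hough space, can be illustrated with a minimal classical (non-learned) sketch. This is not VibNet's differentiable implementation: the window/hop sizes, the synthetic data, and the thresholding step are illustrative assumptions.

```python
import numpy as np

def vibration_energy_map(frames, fs, f_vib, win=32, hop=8):
    """Per-pixel STFT energy at the vibration frequency.
    frames: (T, H, W) image sequence, fs: frame rate (Hz),
    f_vib: externally applied vibration frequency (Hz)."""
    T, H, W = frames.shape
    window = np.hanning(win)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    k = int(np.argmin(np.abs(freqs - f_vib)))  # bin nearest f_vib
    energy = np.zeros((H, W))
    n_seg = 0
    for s in range(0, T - win + 1, hop):
        seg = frames[s:s + win] * window[:, None, None]
        spec = np.fft.rfft(seg, axis=0)          # FFT along time axis
        energy += np.abs(spec[k]) ** 2
        n_seg += 1
    return energy / max(n_seg, 1)

def hough_line_peak(mask, n_theta=180):
    """Classical Hough transform: (rho, theta_deg) of the strongest
    line through the nonzero pixels of a binary mask."""
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(*mask.shape)))
    acc = np.zeros((2 * diag + 1, n_theta))
    ys, xs = np.nonzero(mask)
    for y, x in zip(ys, xs):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return r - diag, np.degrees(thetas[t])

# Synthetic demo: a diagonal "needle" vibrating at f_vib inside noise.
rng = np.random.default_rng(0)
fs, f_vib, T, H, W = 100.0, 12.5, 64, 64, 64
t = np.arange(T)
frames = 0.05 * rng.standard_normal((T, H, W))
for i in range(H):  # needle along the main diagonal
    frames[:, i, i] += np.sin(2 * np.pi * f_vib * t / fs)
emap = vibration_energy_map(frames, fs, f_vib)
rho, theta = hough_line_peak(emap > 0.5 * emap.max())
print(round(theta))  # main-diagonal line -> theta of 135 degrees
```

Even with noise ~5% of the vibration amplitude, the frequency-domain energy map cleanly separates vibrating needle pixels from static tissue, which is the intuition behind detecting in the frequency rather than the intensity domain; VibNet replaces both stages with differentiable, learned counterparts trained end to end.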