ConVibNet: Needle Detection during Continuous Insertion via Frequency-Inspired Features

📅 2026-03-01
🤖 AI Summary
This study addresses the challenge of poor needle visibility in ultrasound-guided interventions, where low image contrast, occlusions, and artifacts hinder accurate real-time localization. To overcome this, the authors propose ConVibNet, a novel framework that jointly estimates needle tip position and shaft orientation by explicitly modeling temporal dependencies across consecutive ultrasound frames. The method introduces an intersection-difference loss function to enforce temporal consistency of needle tip motion and integrates frequency-inspired features with temporal modeling. Evaluated on a newly curated ultrasound puncture dataset, ConVibNet achieves a needle tip localization error of 2.80 ± 2.42 mm and an angular error of 1.69 ± 2.00°, outperforming the best baseline by 0.75 mm in localization accuracy while maintaining real-time inference capability.

📝 Abstract
Purpose: Ultrasound-guided needle interventions are widely used in clinical practice, but their success critically depends on accurate needle placement, which is frequently hindered by the poor and intermittent visibility of needles in ultrasound images. Existing approaches remain limited by artifacts, occlusions, and low contrast, and often fail to support real-time continuous insertion. To overcome these challenges, this study introduces a robust real-time framework for continuous needle detection. Methods: We present ConVibNet, an extension of VibNet for detecting needles with significantly reduced visibility, addressing real-time, continuous needle tracking during insertion. ConVibNet leverages temporal dependencies across successive ultrasound frames to enable continuous estimation of both needle tip position and shaft angle in dynamic scenarios. To strengthen temporal awareness of needle-tip motion, we introduce a novel intersection-and-difference loss that explicitly leverages motion correlations across consecutive frames. In addition, we curated a dedicated dataset for model development and evaluation. Results: The performance of the proposed ConVibNet model was evaluated on our dataset, demonstrating superior accuracy compared to the baseline VibNet and UNet-LSTM models. Specifically, ConVibNet achieved a tip error of 2.80 ± 2.42 mm and an angle error of 1.69 ± 2.00°. These results represent a 0.75 mm improvement in tip localization accuracy over the best-performing baseline, while preserving real-time inference capability. Conclusion: ConVibNet advances real-time needle detection in ultrasound-guided interventions by integrating temporal correlation modeling with a novel intersection-and-difference loss, thereby improving accuracy and robustness and demonstrating high potential for integration into autonomous insertion systems.
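The abstract states that the intersection-and-difference loss enforces temporal consistency of tip motion across consecutive frames, but does not give its formulation. As a rough intuition only, a minimal sketch of a generic temporal-consistency penalty on predicted tip trajectories is shown below; the function name, the second-difference formulation, and the choice of norm are illustrative assumptions, not the authors' actual loss.

```python
import numpy as np

def temporal_consistency_loss(tips: np.ndarray) -> float:
    """Illustrative temporal-consistency penalty (NOT the paper's
    intersection-and-difference loss, whose exact form is not given here).

    tips: (T, 2) array of predicted needle-tip positions (in mm) over T
    consecutive ultrasound frames. During steady insertion the tip should
    move smoothly, so abrupt changes in frame-to-frame displacement
    (second differences) are penalized.
    """
    disp = np.diff(tips, axis=0)    # (T-1, 2): per-frame tip motion
    accel = np.diff(disp, axis=0)   # (T-2, 2): change in motion between frames
    return float(np.mean(np.linalg.norm(accel, axis=1)))

# A smooth, constant-velocity track incurs zero penalty
track = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0], [3.0, 1.5]])
print(temporal_consistency_loss(track))  # 0.0
```

In a training setup such a term would be added to the per-frame localization loss, trading off single-frame accuracy against trajectory smoothness.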
Problem

Research questions and friction points this paper is trying to address.

needle detection
ultrasound guidance
continuous insertion
real-time tracking
poor visibility
Innovation

Methods, ideas, or system contributions that make the work stand out.

ConVibNet
temporal correlation
intersection-and-difference loss
real-time needle tracking
ultrasound-guided intervention
Jiamei Guo
Computer Aided Medical Procedures and Augmented Reality (CAMP), Technical University of Munich, Munich, Germany.
Zhehao Duan
Computer Aided Medical Procedures and Augmented Reality (CAMP), Technical University of Munich, Munich, Germany.
Maria Neiiendam
Technical University of Denmark, Kongens Lyngby, Denmark.
Dianye Huang
Technical University of Munich
robotic ultrasound, medical robot, intelligent control, human robot interaction
Nassir Navab
Professor of Computer Science, Technische Universität München
Zhongliang Jiang
University of Hong Kong
Medical Robotics, Ultrasound imaging, Robot learning, Surgical Robotics, Human-robot Interaction