Robotic Ultrasound Makes CBCT Alive

📅 2026-03-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitation of static cone-beam computed tomography (CBCT) in intraoperative navigation, which fails to capture real-time soft tissue deformations caused by respiration, probe pressure, and surgical manipulation, leading to navigation inaccuracies. To overcome this, the authors propose a deformation-aware CBCT update framework driven by robotic ultrasound. Leveraging a lightweight USCorUNet architecture and optical flow–guided supervised learning, the method infers tissue motion from dynamic ultrasound proxies and updates CBCT slices in real time without additional radiation exposure. The framework integrates LC2-based rigid registration for initialization, dense deformation field estimation, spatial regularization, and multimodal alignment to enable end-to-end, physically plausible dynamic CBCT visualization. This approach significantly enhances the accuracy of dynamic navigation in robot-assisted ultrasound-guided interventional procedures.
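The final step of the pipeline described above, applying an inferred dense deformation field to a static CBCT slice, can be sketched as a simple image warp. This is an illustrative assumption, not the paper's USCorUNet implementation: the function name `warp_slice` and the `(2, H, W)` displacement layout are hypothetical.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_slice(cbct_slice: np.ndarray, deformation: np.ndarray) -> np.ndarray:
    """Warp a 2D CBCT slice with a dense deformation field.

    `deformation` has shape (2, H, W): per-pixel (dy, dx) displacements,
    standing in for the motion inferred from the ultrasound stream.
    """
    h, w = cbct_slice.shape
    grid_y, grid_x = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Sample the static slice at the displaced coordinates (bilinear).
    coords = np.stack([grid_y + deformation[0], grid_x + deformation[1]])
    return map_coordinates(cbct_slice, coords, order=1, mode="nearest")
```

With a zero field the slice is returned unchanged; a nonzero field resamples it toward the current tissue configuration, which is what lets the static volume track intraoperative motion without re-scanning.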

📝 Abstract
Intraoperative Cone Beam Computed Tomography (CBCT) provides a reliable 3D anatomical context essential for interventional planning. However, its static nature fails to provide continuous monitoring of soft-tissue deformations induced by respiration, probe pressure, and surgical manipulation, leading to navigation discrepancies. We propose a deformation-aware CBCT updating framework that leverages robotic ultrasound as a dynamic proxy to infer tissue motion and update static CBCT slices in real time. Starting from calibration-initialized alignment with linear correlation of linear combination (LC2)-based rigid refinement, our method establishes accurate multimodal correspondence. To capture intraoperative dynamics, we introduce the ultrasound correlation UNet (USCorUNet), a lightweight network trained with optical flow-guided supervision to learn deformation-aware correlation representations, enabling accurate, real-time dense deformation field estimation from ultrasound streams. The inferred deformation is spatially regularized and transferred to the CBCT reference to produce deformation-consistent visualizations without repeated radiation exposure. We validate the proposed approach through deformation estimation and ultrasound-guided CBCT updating experiments. Results demonstrate real-time end-to-end CBCT slice updating and physically plausible deformation estimation, enabling dynamic refinement of static CBCT guidance during robotic ultrasound-assisted interventions. The source code is publicly available at https://github.com/anonymous-codebase/us-cbct-demo.
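The abstract notes that the inferred deformation is spatially regularized before being transferred to the CBCT reference. A minimal sketch of one common choice, per-component Gaussian smoothing of the dense field, is below; this is an assumption about the regularizer, not the paper's actual formulation, and `regularize_field` and its `sigma` parameter are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def regularize_field(deformation: np.ndarray, sigma: float = 4.0) -> np.ndarray:
    """Smooth each component of a dense (2, H, W) deformation field.

    Gaussian filtering suppresses high-frequency estimation noise so the
    warped CBCT stays physically plausible (no tearing or folding from
    spurious per-pixel displacements).
    """
    return np.stack([gaussian_filter(c, sigma=sigma) for c in deformation])
```

Larger `sigma` trades responsiveness to sharp local motion for smoother, more anatomically consistent updates.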
Problem

Research questions and friction points this paper is trying to address.

CBCT
soft-tissue deformation
intraoperative imaging
navigation discrepancy
real-time monitoring
Innovation

Methods, ideas, or system contributions that make the work stand out.

deformation-aware CBCT updating
robotic ultrasound
USCorUNet
real-time dense deformation estimation
multimodal image registration