Advancing Fetal Ultrasound Image Quality Assessment in Low-Resource Settings

📅 2025-07-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
In resource-constrained settings, fetal ultrasound image quality is often suboptimal due to a shortage of trained sonographers, compromising the accuracy of critical biometric measurements such as abdominal circumference. To address this, the authors propose FetalCLIP$_{CLS}$, a lightweight classifier built upon the vision-language pretrained model FetalCLIP and adapted with Low-Rank Adaptation (LoRA) for parameter-efficient fine-tuning. Evaluated on prospectively acquired blind-sweep ultrasound data from the ACOUSLIC-AI dataset, FetalCLIP$_{CLS}$ achieves an F1 score of 0.757, outperforming six CNN- and Transformer-based baselines; repurposing an adapted segmentation model for image quality classification further raises the F1 score to 0.771. This work demonstrates a practical route to automated, scalable ultrasound quality control in low-resource environments.

📝 Abstract
Accurate fetal biometric measurements, such as abdominal circumference, play a vital role in prenatal care. However, obtaining high-quality ultrasound images for these measurements heavily depends on the expertise of sonographers, posing a significant challenge in low-income countries due to the scarcity of trained personnel. To address this issue, we leverage FetalCLIP, a vision-language model pretrained on a curated dataset of over 210,000 fetal ultrasound image-caption pairs, to perform automated fetal ultrasound image quality assessment (IQA) on blind-sweep ultrasound data. We introduce FetalCLIP$_{CLS}$, an IQA model adapted from FetalCLIP using Low-Rank Adaptation (LoRA), and evaluate it on the ACOUSLIC-AI dataset against six CNN and Transformer baselines. FetalCLIP$_{CLS}$ achieves the highest F1 score of 0.757. Moreover, we show that an adapted segmentation model, when repurposed for classification, further improves performance, achieving an F1 score of 0.771. Our work demonstrates how parameter-efficient fine-tuning of fetal ultrasound foundation models can enable task-specific adaptations, advancing prenatal care in resource-limited settings. The experimental code is available at: https://github.com/donglihe-hub/FetalCLIP-IQA.
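The abstract's core technique is LoRA: the pretrained FetalCLIP weights stay frozen while a small low-rank update is trained on top of them. The sketch below illustrates the mechanism on a single projection layer in plain NumPy; the layer sizes, rank, and scaling constant are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration: one frozen projection layer of a
# vision encoder (d_in -> d_out) and a small LoRA rank r with scale alpha.
d_in, d_out, r, alpha = 768, 768, 8, 16

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d_in)) * 0.01   # LoRA down-projection (trainable)
B = np.zeros((d_out, r))                # LoRA up-projection, zero-init (trainable)

def adapted_forward(x):
    """y = W x + (alpha / r) * B A x -- the standard LoRA update."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)

# With B initialized to zero, the adapter is a no-op:
# the adapted model reproduces the frozen model exactly at the start.
assert np.allclose(adapted_forward(x), W @ x)

# Only A and B are trained, a small fraction of the full layer's parameters.
full_params = W.size
lora_params = A.size + B.size
print(f"trainable fraction: {lora_params / full_params:.4f}")
# → trainable fraction: 0.0208
```

In practice this is applied per attention/projection layer of the frozen encoder (e.g. via a library such as Hugging Face PEFT), with only the low-rank factors and the classification head receiving gradients.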
Problem

Research questions and friction points this paper is trying to address.

Automating fetal ultrasound image quality assessment
Reducing reliance on expert sonographers in low-resource settings
Improving prenatal care with parameter-efficient fine-tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses FetalCLIP for automated ultrasound quality assessment
Applies LoRA for parameter-efficient model adaptation
Repurposes segmentation model to boost classification performance
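The summary does not spell out how a segmentation model is repurposed for image-level quality classification. One common pattern, sketched below as an assumption rather than the paper's actual mechanism, is to pool the per-pixel segmentation logits into a single frame-level score: a frame where the target anatomy is clearly visible produces a region of confident pixels and hence a high pooled score.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical output of a fetal-structure segmentation model: a per-pixel
# logit map over an H x W ultrasound frame (values are illustrative).
H, W = 64, 64
seg_logits = rng.normal(loc=-2.0, scale=1.0, size=(H, W))
seg_logits[20:40, 20:40] += 5.0  # a region the model treats as visible anatomy

def seg_to_quality_score(logits, top_frac=0.1):
    """Image-level score from a segmentation map: the mean of the top
    fraction of pixel logits. Confidently segmented anatomy pulls the
    score up; frames with no visible structure score low."""
    k = max(1, int(top_frac * logits.size))
    top = np.sort(logits, axis=None)[-k:]
    return float(top.mean())

score = seg_to_quality_score(seg_logits)
is_measurable = score > 0.0  # hypothetical decision threshold
print(score, is_measurable)
```

The threshold and pooling rule here are placeholders; in a trained system the pooled features would typically feed a small learned classification head instead of a fixed cutoff.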