SparseFocus: Learning-based One-shot Autofocus for Microscopy with Sparse Content

📅 2025-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional autofocus methods—including hill-climbing algorithms and state-of-the-art learning-based approaches—fail under sparse-content microscopy imaging scenarios, where informative regions are scarce and spatially isolated. Method: We propose the first sparsity-aware single-shot autofocus framework, comprising an end-to-end trainable two-stage deep network: (1) a content-importance heatmap prediction stage to localize salient regions; and (2) a sparse-aware defocus regression stage that estimates defocus distance exclusively from those regions, incorporating a sparsity-aware regression head for enhanced robustness. Contribution/Results: We introduce the first large-scale, densely annotated dataset covering dense, sparse, and extremely sparse microscopy scenes, and establish the first content-importance-guided focusing paradigm. Extensive experiments demonstrate significant performance gains over both classical and learning-based baselines across all sparsity levels. The method has been successfully deployed in whole-slide imaging (WSI) systems, enabling real-time, high-precision autofocus.
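The two-stage pipeline described above can be sketched in a minimal form. This is an illustrative stand-in, not the paper's implementation: the learned content-importance network is replaced by per-patch local variance, and `predict_patch_defocus` is a hypothetical placeholder for the paper's trained defocus regressor.

```python
import numpy as np

def importance_heatmap(img, patch=32):
    """Stage 1 stand-in: per-patch local variance as a crude proxy for the
    learned content-importance heatmap (the paper trains a network for this)."""
    h, w = img.shape
    H, W = h // patch, w // patch
    heat = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            heat[i, j] = img[i*patch:(i+1)*patch, j*patch:(j+1)*patch].var()
    return heat

def sparse_defocus_estimate(img, predict_patch_defocus, patch=32, top_k=4):
    """Stage 2 stand-in: estimate defocus only from the top-k most
    informative patches, weighting each per-patch estimate by importance."""
    heat = importance_heatmap(img, patch)
    flat = heat.ravel()
    idx = np.argsort(flat)[::-1][:top_k]   # most informative patches first
    weights = flat[idx]
    if weights.sum() == 0:                 # fully empty field of view
        return 0.0
    estimates = []
    for k in idx:
        i, j = divmod(k, heat.shape[1])
        p = img[i*patch:(i+1)*patch, j*patch:(j+1)*patch]
        estimates.append(predict_patch_defocus(p))
    return float(np.average(estimates, weights=weights))
```

The importance weighting is what makes the estimate robust to sparse content: empty background patches receive zero weight, so they never dilute the regression, which is the failure mode of whole-image methods that the paper targets.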

📝 Abstract
Autofocus is necessary for high-throughput and real-time scanning in microscopic imaging. Traditional methods rely on complex hardware or iterative hill-climbing algorithms. Recent learning-based approaches have demonstrated remarkable efficacy in a one-shot setting, avoiding hardware modifications or iterative mechanical lens adjustments. However, in this paper, we highlight a key challenge: the richness of image content can significantly affect autofocus performance. When the image content is sparse, previous autofocus methods, whether traditional hill-climbing or learning-based, tend to fail. To tackle this, we propose a content-importance-based solution, named SparseFocus, featuring a novel two-stage pipeline. The first stage measures the importance of regions within the image, while the second stage calculates the defocus distance from selected important regions. To validate our approach and benefit the research community, we collect a large-scale dataset comprising millions of labelled defocused images, encompassing dense, sparse, and extremely sparse scenarios. Experimental results show that SparseFocus surpasses existing methods, effectively handling all levels of content sparsity. Moreover, we integrate SparseFocus into our Whole Slide Imaging (WSI) system, where it performs well in real-world applications. The code and dataset will be made available upon the publication of this paper.
Problem

Research questions and friction points this paper is trying to address.

Improves autofocus in sparse microscopy images
Introduces two-stage content-based autofocus method
Validates with a large-scale defocused image dataset
Innovation

Methods, ideas, or system contributions that make the work stand out.

Two-stage pipeline
Content-importance-based solution
Large-scale labelled dataset
Yongping Zhai
College of Advanced Interdisciplinary Studies, National University of Defense Technology, Changsha, 410073, Hunan, China
Xiaoxi Fu
College of Advanced Interdisciplinary Studies, National University of Defense Technology, Changsha, 410073, Hunan, China
Qiang Su
City University of Hong Kong
Data center networks, software-hardware co-design
Jia Hu
University of Exeter
Edge-cloud computing, resource optimization, smart cities, network security, applied machine learning
Yake Zhang
College of Advanced Interdisciplinary Studies, National University of Defense Technology, Changsha, 410073, Hunan, China
Yunfeng Zhou
College of Advanced Interdisciplinary Studies, National University of Defense Technology, Changsha, 410073, Hunan, China
Chaofan Zhang
Institute of Automation, Chinese Academy of Sciences
Tactile perception and robot dexterous manipulation
Xiao Li
College of Advanced Interdisciplinary Studies, National University of Defense Technology, Changsha, 410073, Hunan, China
Wenxin Wang
College of Advanced Interdisciplinary Studies, National University of Defense Technology, Changsha, 410073, Hunan, China
Dongdong Wu
Department of Information, Army Medical University, Chongqing, 400042, Chongqing, China
Shen Yan
College of System Engineering, National University of Defense Technology, Changsha, 410073, Hunan, China