FLIM Networks with Bag of Feature Points

📅 2026-02-24
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work proposes FLIM-BoFP, a novel approach that addresses the high annotation cost and training expense of convolutional neural networks for parasite detection in microscopy images. Departing from conventional layer-wise clustering strategies, FLIM-BoFP builds a global Bag of Feature Points through a single clustering step at the input block; these points are then mapped across layers to construct the filters of each encoder block directly, without backpropagation. By pairing a clustering-driven encoder design with an adaptive decoder, the method substantially improves both the speed of filter estimation and control over filter locations. Experimental results demonstrate that FLIM-BoFP outperforms FLIM-Cluster and current state-of-the-art methods in efficiency, accuracy, and generalization on parasite detection in optical microscopy images.

📝 Abstract
Convolutional networks require extensive image annotation, which can be costly and time-consuming. Feature Learning from Image Markers (FLIM) tackles this challenge by estimating encoder filters (i.e., kernel weights) from user-drawn markers on discriminative regions of a few representative images without traditional optimization. Such an encoder combined with an adaptive decoder comprises a FLIM network fully trained without backpropagation. Prior research has demonstrated their effectiveness in Salient Object Detection (SOD), being significantly lighter than existing lightweight models. This study revisits FLIM SOD and introduces FLIM-Bag of Feature Points (FLIM-BoFP), a considerably faster filter estimation method. The previous approach, FLIM-Cluster, derives filters through patch clustering at each encoder's block, leading to computational overhead and reduced control over filter locations. FLIM-BoFP streamlines this process by performing a single clustering at the input block, creating a bag of feature points, and defining filters directly from mapped feature points across all blocks. The paper evaluates the benefits in efficiency, effectiveness, and generalization of FLIM-BoFP compared to FLIM-Cluster and other state-of-the-art baselines for parasite detection in optical microscopy images.
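To make the abstract's pipeline concrete, here is a minimal NumPy-only sketch of the FLIM-BoFP idea as I read it, not the authors' implementation: cluster the user-marked pixels once at the input block to obtain a bag of feature points, define each convolutional filter directly from the normalized patch around a point, and reuse the same points at deeper blocks by mapping their coordinates through the downsampling stride. All function names (`kmeans`, `bag_of_feature_points`, `filters_from_points`, `map_points`) and the toy image are hypothetical illustrations.

```python
# Hypothetical sketch of FLIM-BoFP-style filter estimation; not the paper's code.
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Tiny k-means (NumPy only) standing in for the one-time clustering step."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return centers

def bag_of_feature_points(marker_coords, n_filters, seed=0):
    """Single clustering at the input block; the rounded cluster centers
    form the bag of feature points shared by all encoder blocks."""
    return np.rint(kmeans(marker_coords, n_filters, seed=seed)).astype(int)

def filters_from_points(image, points, k=3):
    """One k x k filter per feature point: the zero-mean, unit-norm patch
    centered at the point -- estimated directly, with no backpropagation."""
    r = k // 2
    padded = np.pad(image, r, mode="reflect")
    filters = []
    for y, x in points:
        patch = padded[y:y + k, x:x + k].astype(float)
        patch -= patch.mean()
        n = np.linalg.norm(patch)
        filters.append(patch / n if n > 0 else patch)
    return np.stack(filters)

def map_points(points, stride):
    """Cross-layer mapping: a point found at the input lands at
    coordinates // stride after downsampling by `stride`."""
    return points // stride

# Toy example: user markers drawn on two bright blobs of a synthetic image.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
img[5:9, 5:9] += 2.0
img[20:24, 20:24] += 2.0
markers = np.array([(y, x) for y in range(5, 9) for x in range(5, 9)] +
                   [(y, x) for y in range(20, 24) for x in range(20, 24)])

points = bag_of_feature_points(markers, n_filters=2)
filters = filters_from_points(img, points, k=3)
print(filters.shape)          # (2, 3, 3): one filter per feature point
print(map_points(points, 2))  # same points reused at a stride-2 block
```

The single clustering pass is the efficiency argument in the abstract: contrast this with FLIM-Cluster, which would re-run the clustering at every encoder block instead of reusing `points` via `map_points`.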
Problem

Research questions and friction points this paper is trying to address.

Salient Object Detection
Few-shot Learning
Filter Estimation
Microscopy Image Analysis
Annotation-efficient Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

FLIM
Bag of Feature Points
Filter Estimation
Salient Object Detection
No Backpropagation
João Deltregia Martinelli
Institute of Computing, UNICAMP, Campinas, Brazil
Marcelo Luis Rodrigues Filho
Institute of Computing, UNICAMP, Campinas, Brazil
Felipe Crispim da Rocha Salvagnini
MSc Student, UNICAMP - Institute of Computing
Computer Vision and Pattern Recognition
Gilson Junior Soares
Institute of Computing, UNICAMP, Campinas, Brazil
Jefersson A. dos Santos
University of Sheffield - School of Computer Science
Computer Vision · Machine Learning · Remote Sensing · GeoAI
Alexandre X. Falcão
Institute of Computing, UNICAMP, Campinas, Brazil