IPEC: Test-Time Incremental Prototype Enhancement Classifier for Few-Shot Learning

πŸ“… 2026-01-16
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This work addresses the instability and performance limitations of few-shot learning during inference, which arise from the common assumption of batch-wise independence that prevents leveraging historical query samples. To overcome this, the authors propose the Incremental Prototype Enhancement Classifier (IPEC), which dynamically constructs an auxiliary set of high-confidence query samples and fuses it with the support set to progressively refine class prototypes. IPEC incorporates a dual-filtering mechanism that balances global confidence and local discriminability, along with a Bayesian-inspired prototype update strategy that treats the support set as a prior and the auxiliary set as likelihood-derived evidence. A two-stage β€œwarm-up–test” inference protocol is introduced to move beyond static prototype representations. Extensive experiments demonstrate that IPEC significantly outperforms existing methods across multiple few-shot classification benchmarks, effectively enhancing both prototype stability and classification accuracy.
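The summary above describes two mechanisms: a dual filter that admits query samples by both global prediction confidence and local discriminability, and a Bayesian-inspired update that fuses support-set prototypes (prior) with auxiliary-set evidence. The paper's exact formulation is not given here, so the following is a minimal NumPy sketch under stated assumptions: Euclidean-distance prototypes, softmax confidence, a top-1-minus-top-2 probability margin as the local criterion, and count-weighted mean fusion; all thresholds and function names are hypothetical.

```python
import numpy as np

def dual_filter(query_feats, prototypes, conf_thresh=0.9, margin_thresh=0.5):
    """Select high-confidence queries (hypothetical thresholds).

    Global criterion: max softmax probability over negative distances.
    Local criterion: margin between the two most probable classes.
    """
    # Pairwise Euclidean distances, shape (n_queries, n_classes).
    dists = np.linalg.norm(query_feats[:, None, :] - prototypes[None, :, :], axis=-1)
    logits = -dists
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    conf = probs.max(axis=1)                       # global confidence
    sorted_p = np.sort(probs, axis=1)
    margin = sorted_p[:, -1] - sorted_p[:, -2]     # local discriminability
    keep = (conf >= conf_thresh) & (margin >= margin_thresh)
    labels = probs.argmax(axis=1)
    return keep, labels

def update_prototypes(support_protos, n_support, aux_feats, aux_labels):
    """Count-weighted mean fusion: support prototype as prior (weight
    n_support), auxiliary samples as evidence (weight = sample count)."""
    protos = support_protos.copy().astype(float)
    for c in range(len(protos)):
        aux_c = aux_feats[aux_labels == c]
        if len(aux_c):
            protos[c] = (n_support * protos[c] + aux_c.sum(axis=0)) / (
                n_support + len(aux_c))
    return protos
```

In this sketch, ambiguous queries near a decision boundary fail the margin test and never enter the auxiliary set, so prototypes drift only toward samples the current classifier is already confident about.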

πŸ“ Abstract
Metric-based few-shot approaches have gained significant popularity due to their relatively straightforward implementation, high interpretability, and computational efficiency. However, they suffer from the batch-independence assumption during testing, which prevents the model from leveraging valuable knowledge accumulated from previous batches. To address this challenge, we propose the Incremental Prototype Enhancement Classifier (IPEC), a test-time method that optimizes prototype estimation by leveraging information from previous query samples. IPEC maintains a dynamic auxiliary set by selectively incorporating query samples that are classified with high confidence. To ensure sample quality, we design a robust dual-filtering mechanism that assesses each query sample based on both global prediction confidence and local discriminative ability. By aggregating this auxiliary set with the support set in subsequent tasks, IPEC builds progressively more stable and representative prototypes, effectively reducing its reliance on the initial support set. We ground this approach in a Bayesian interpretation, conceptualizing the support set as a prior and the auxiliary set as data-driven evidence, which in turn motivates the design of a practical "warm-up and test" two-stage inference protocol. Extensive empirical results validate the superior performance of our proposed method across multiple few-shot classification tasks.
Problem

Research questions and friction points this paper is trying to address.

few-shot learning
prototype estimation
test-time adaptation
batch independence
incremental learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

few-shot learning
prototype enhancement
test-time adaptation
incremental learning
Bayesian inference
πŸ”Ž Similar Papers
No similar papers found.
Wenwen Liao
College of Intelligent Robotics and Advanced Manufacturing, Fudan University, Shanghai, 201203, China
Hang Ruan
College of Intelligent Robotics and Advanced Manufacturing, Fudan University, Shanghai, 201203, China
Jianbo Yu
Professor of School of Mechanical Engineering, Tongji University
Prognostics and Health Management, Condition-Based Monitoring, Quality Control, Fault Diagnosis, Industrial Engineering
Xiaofeng Yang
School of Microelectronics, Fudan University, Shanghai, 201203, China
Qingchao Jiang
Key Laboratory of Advanced Control and Optimization for Chemical Processes of Ministry of Education, East China University of Science and Technology, Shanghai 200237, China
Xuefeng Yan
Molecular Imaging Branch/National Institute of Mental Health/National Institutes of Health
Molecular imaging