ConCM: Consistency-Driven Calibration and Matching for Few-Shot Class-Incremental Learning

๐Ÿ“… 2025-06-24
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Few-Shot Class-Incremental Learning (FSCIL) suffers from knowledge conflict between adapting to novel classes and retaining previously learned knowledge. Existing forward-looking space construction methods are constrained by prototype bias and structural rigidity, resulting in suboptimal embedding expressiveness. To address this, we propose a consistency-driven calibration and matching framework. First, we introduce a memory-aware prototype calibration mechanism to enhance conceptual consistency of class centroids in the feature space. Second, we design a dynamic structural matching module that adaptively aligns each incremental session with its optimal manifold spaceโ€”without requiring pre-specified numbers of classes. Inspired by hippocampal associative memory, our method jointly optimizes both feature-level and geometric-structural consistency. Extensive experiments on mini-ImageNet and CUB200 demonstrate state-of-the-art performance, achieving absolute improvements of +3.20% and +3.68% in incremental session harmonic accuracy, respectively.

๐Ÿ“ Abstract
Few-Shot Class-Incremental Learning (FSCIL) requires models to adapt to novel classes with limited supervision while preserving learned knowledge. Existing prospective learning-based space construction methods reserve embedding space to accommodate novel classes; however, prototype deviation and structural rigidity limit the expressiveness of the embedding space. In contrast to fixed space reservation, we explore the optimization of feature-structure dual consistency and propose a Consistency-driven Calibration and Matching framework (ConCM) that systematically mitigates the knowledge conflict inherent in FSCIL. Specifically, inspired by hippocampal associative memory, we design a memory-aware prototype calibration that extracts generalized semantic attributes from base classes and reintegrates them into novel classes to enhance the conceptual-center consistency of features. Further, we propose dynamic structure matching, which adaptively aligns the calibrated features to a session-specific optimal manifold space, ensuring cross-session structural consistency. Theoretical analysis shows that our method satisfies both geometric optimality and maximum matching, thereby removing the need for class-number priors. On large-scale FSCIL benchmarks including mini-ImageNet and CUB200, ConCM achieves state-of-the-art performance, surpassing the current best method by 3.20% and 3.68% in harmonic accuracy over incremental sessions.
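The calibration idea described in the abstract, pulling a few-shot prototype toward semantically related base-class knowledge, can be illustrated with a minimal sketch. The function name, the mixing weight `alpha`, and the top-k similarity retrieval below are illustrative assumptions, not the paper's actual formulation:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb + 1e-12)

def calibrate_prototype(novel_proto, base_protos, alpha=0.5, top_k=2):
    """Blend a few-shot prototype with a similarity-weighted average of the
    most related base-class prototypes (hypothetical calibration rule)."""
    # Retrieve the top-k base prototypes most similar to the novel one.
    scored = sorted(((cosine(novel_proto, p), p) for p in base_protos),
                    key=lambda sp: sp[0], reverse=True)[:top_k]
    total = sum(s for s, _ in scored) + 1e-12
    dim = len(novel_proto)
    # Similarity-weighted "memory" vector built from base-class attributes.
    memory = [sum(s * p[i] for s, p in scored) / total for i in range(dim)]
    # Calibrated prototype: convex mix of few-shot estimate and memory.
    return [alpha * novel_proto[i] + (1.0 - alpha) * memory[i]
            for i in range(dim)]
```

With `alpha=1.0` the few-shot estimate is kept unchanged; smaller values lean more on base-class memory, which is the intuition behind reducing prototype deviation when only a few novel samples are available.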
Problem

Research questions and friction points this paper is trying to address.

Mitigates prototype deviation in few-shot incremental learning
Enhances feature consistency via memory-aware prototype calibration
Ensures structure consistency with dynamic session-specific matching
Innovation

Methods, ideas, or system contributions that make the work stand out.

Memory-aware prototype calibration for feature consistency
Dynamic structure matching for manifold alignment
Geometric optimality and maximum matching theory
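The structure-matching contribution can be sketched as a one-to-one assignment of calibrated prototypes to target anchors of a session-specific geometric structure. The toy below uses greedy highest-cosine-first assignment; the paper's maximum-matching guarantee would call for an optimal assignment solver (e.g. the Hungarian algorithm), which is simplified away here, and the function name is an assumption:

```python
import math

def greedy_structure_match(prototypes, anchors):
    """Assign each prototype to a distinct target anchor, greedily taking
    the highest-cosine pair first (a cheap stand-in for optimal matching)."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb + 1e-12)

    # Score every (prototype, anchor) pair, best matches first.
    pairs = sorted(((cos(p, a), i, j)
                    for i, p in enumerate(prototypes)
                    for j, a in enumerate(anchors)),
                   key=lambda t: t[0], reverse=True)
    assignment, used_p, used_a = {}, set(), set()
    for _, i, j in pairs:
        # Keep the assignment one-to-one.
        if i not in used_p and j not in used_a:
            assignment[i] = j
            used_p.add(i)
            used_a.add(j)
    return assignment
```

Because the anchor set can be regenerated per session, such a matching step does not need to know the total number of classes in advance, which mirrors the paper's claim of avoiding class-number priors.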
๐Ÿ”Ž Similar Papers
Qinzhe Wang
School of Automation, Central South University, China
Zixuan Chen
School of Automation, Central South University, China
Keke Huang
University of British Columbia
Databases, Graph Neural Networks, Large Language Models
Xiu Su
Big Data Institute, Central South University, China
Chunhua Yang
Central South University
Control and optimization of industrial processes, dynamics of systems
Chang Xu
School of Computer Science, Faculty of Engineering, The University of Sydney, Australia