MoTiC: Momentum Tightness and Contrast for Few-Shot Class-Incremental Learning

📅 2025-09-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
FSCIL faces dual challenges: substantial estimation bias in novel-class prototypes and excessive intra-class feature dispersion for base classes. To address these, we propose a momentum-based compact contrastive learning framework. First, we introduce momentum-driven self-supervision coupled with virtual class generation to construct a highly cohesive feature space, mitigating feature dispersion. Second, we align novel-class prototypes via Bayesian prior regularization, with a theoretical analysis proving that it significantly reduces prototype estimation variance. Third, we integrate backbone freezing, class-mean prototype initialization, and large-scale cross-class contrastive learning, enhancing prototype robustness without adding parameters. Our method achieves state-of-the-art performance on three standard FSCIL benchmarks, with notably large gains in incremental-learning stability and generalization on the fine-grained CUB-200 dataset.
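
The first and third components can be pictured with a short PyTorch-style sketch: a MoCo-style exponential-moving-average (EMA) update for the momentum encoder, plus class-mean prototype initialization. This is a minimal illustration under assumed conventions (the momentum coefficient m = 0.999 and all function names are ours, not the paper's):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def momentum_update(encoder_q, encoder_k, m=0.999):
    """MoCo-style EMA: the key encoder slowly tracks the query encoder."""
    for p_q, p_k in zip(encoder_q.parameters(), encoder_k.parameters()):
        p_k.data.mul_(m).add_(p_q.data, alpha=1 - m)

@torch.no_grad()
def class_mean_prototypes(features, labels, num_classes):
    """Initialize each class prototype as the mean of its features.
    Assumes every class in [0, num_classes) appears in `features`."""
    protos = features.new_zeros(num_classes, features.size(1))
    for c in range(num_classes):
        protos[c] = features[labels == c].mean(dim=0)
    return F.normalize(protos, dim=1)  # unit-norm prototypes for cosine scoring
```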

📝 Abstract
Few-Shot Class-Incremental Learning (FSCIL) must contend with the dual challenge of learning new classes from scarce samples while preserving knowledge of old classes. Existing methods use a frozen feature extractor and class-averaged prototypes to mitigate catastrophic forgetting and overfitting. However, new-class prototypes suffer significant estimation bias due to extreme data scarcity, whereas base-class prototypes benefit from sufficient data. In this work, we show theoretically, via Bayesian analysis, that aligning new-class priors with old-class statistics reduces variance and improves prototype accuracy. Furthermore, we propose large-scale contrastive learning to enforce cross-category feature tightness. To further enrich feature diversity and inject prior information into new-class prototypes, we integrate momentum self-supervision and virtual categories into the Momentum Tightness and Contrast (MoTiC) framework, constructing a feature space with rich representations and enhanced inter-class cohesion. Experiments on three FSCIL benchmarks achieve state-of-the-art performance, particularly on the fine-grained CUB-200 task, validating our method's ability to reduce estimation bias and improve the robustness of incremental learning.
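
The variance-reduction claim admits a standard conjugate-Gaussian sketch. The prior and noise model below are illustrative assumptions chosen to make the shrinkage effect explicit, not the paper's exact derivation: with n few-shot samples x_i ~ N(μ, σ²) and a prior μ ~ N(μ₀, τ²) whose mean μ₀ comes from old-class statistics,

```latex
\begin{aligned}
\hat{\mu}_{\mathrm{MAP}}
  &= \frac{\tau^{2}\,\bar{x}_n + (\sigma^{2}/n)\,\mu_0}{\tau^{2} + \sigma^{2}/n},\\[4pt]
\operatorname{Var}\!\big(\hat{\mu}_{\mathrm{MAP}}\big)
  &= \left(\frac{\tau^{2}}{\tau^{2} + \sigma^{2}/n}\right)^{2}\frac{\sigma^{2}}{n}
  \;<\; \frac{\sigma^{2}}{n} \;=\; \operatorname{Var}\!\big(\bar{x}_n\big).
\end{aligned}
```

The shrunken estimate always has lower variance than the raw few-shot mean; the price is a bias toward μ₀, which stays small when old-class statistics are informative about the new class.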
Problem

Research questions and friction points this paper is trying to address.

Reducing estimation bias in new-class prototypes due to scarce samples
Improving prototype accuracy by aligning priors with old-class statistics
Enhancing feature tightness and diversity to prevent catastrophic forgetting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Aligning new-class priors with old-class statistics via Bayesian analysis
Using large-scale contrastive learning for cross-category feature tightness (see the sketch after this list)
Integrating momentum self-supervision and virtual categories for feature diversity
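
As a concrete reading of the contrastive component above, here is a minimal prototype-based InfoNCE sketch in PyTorch. The temperature value and the choice to contrast features against all class prototypes (real plus virtual) are our assumptions for illustration, not the paper's verified loss:

```python
import torch
import torch.nn.functional as F

def prototype_contrastive_loss(features, labels, prototypes, temperature=0.1):
    """Pull each feature toward its own class prototype and push it away
    from every other prototype (real classes plus any virtual classes)."""
    features = F.normalize(features, dim=1)      # (B, D)
    prototypes = F.normalize(prototypes, dim=1)  # (C, D)
    logits = features @ prototypes.t() / temperature
    return F.cross_entropy(logits, labels)
```

Treating prototypes rather than raw samples as contrast targets keeps the number of negatives equal to the number of classes, which is one simple way to scale the contrast across many real and virtual categories.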
Authors

Shuai Huang
School of Artificial Intelligence, South China Normal University, Foshan, China
Yuwu Lu
School of Artificial Intelligence, South China Normal University, Foshan, China
Ming Zhao
School of Internet of Things Engineering, Wuxi University, Wuxi, China