EFC++: Elastic Feature Consolidation with Prototype Re-balancing for Cold Start Exemplar-free Incremental Learning

📅 2025-03-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses feature drift and task-recency bias in exemplar-free class-incremental learning (EFCIL) under the cold-start scenario, where scarce initial-task data prevents learning a stable, high-quality backbone. To tackle these issues, we propose a framework combining elastic feature regularization with prototype re-balancing. Methodologically: (1) we construct an Empirical Feature Matrix (EFM) and use the pseudo-metric it induces in feature space to impose a second-order regularization on feature drift along directions important to previous tasks; (2) we introduce a post-training prototype re-balancing phase that updates classifier weights via Gaussian prototype modeling, jointly balancing plasticity and stability. Extensive experiments on CIFAR-100, Tiny-ImageNet, ImageNet-Subset, ImageNet-1K, and DomainNet demonstrate substantial improvements over state-of-the-art methods, with better accuracy on new classes and stronger retention of previous tasks without stored exemplars.

📝 Abstract
Exemplar-Free Class Incremental Learning (EFCIL) aims to learn from a sequence of tasks without having access to previous task data. In this paper, we consider the challenging Cold Start scenario in which insufficient data is available in the first task to learn a high-quality backbone. This is especially challenging for EFCIL since it requires high plasticity, resulting in feature drift which is difficult to compensate for in the exemplar-free setting. To address this problem, we propose an effective approach that consolidates feature representations by regularizing drift in directions highly relevant to previous tasks and employs prototypes to reduce task-recency bias. Our approach, which we call Elastic Feature Consolidation++ (EFC++), exploits a tractable second-order approximation of feature drift based on a proposed Empirical Feature Matrix (EFM). The EFM induces a pseudo-metric in feature space which we use to regularize feature drift in important directions and to update Gaussian prototypes. In addition, we introduce a post-training prototype re-balancing phase that updates classifiers to compensate for feature drift. Experimental results on CIFAR-100, Tiny-ImageNet, ImageNet-Subset, ImageNet-1K and DomainNet demonstrate that EFC++ is better able to learn new tasks by maintaining model plasticity and significantly outperforms the state-of-the-art.
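The core idea in the abstract — a second-order penalty on feature drift under the pseudo-metric induced by the Empirical Feature Matrix — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `efm_drift_penalty` and the toy inputs are assumptions, and how the EFM itself is estimated is not shown here.

```python
import numpy as np

def efm_drift_penalty(feats_old, feats_new, efm):
    """Quadratic drift penalty under an EFM-induced pseudo-metric.

    For each sample, the drift d = f_new - f_old is penalized by the
    quadratic form d^T E d, so movement along directions that E weights
    heavily (important to previous tasks) costs more than movement along
    directions previous tasks are insensitive to.
    """
    drift = feats_new - feats_old                       # shape (N, D)
    # batched quadratic form d^T E d for every row d of `drift`
    per_sample = np.einsum("nd,de,ne->n", drift, efm, drift)
    return float(np.mean(per_sample))
```

When the EFM is the identity, this reduces to a plain squared-L2 drift penalty; a non-isotropic EFM is what makes the regularization "elastic", leaving unimportant directions free to move.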
Problem

Research questions and friction points this paper is trying to address.

Addresses Cold Start in Exemplar-Free Incremental Learning
Reduces feature drift and task-recency bias using prototypes
Improves model plasticity and performance on new tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Regularizes feature drift using Empirical Feature Matrix
Updates Gaussian prototypes to reduce task-recency bias
Introduces post-training prototype re-balancing phase
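The re-balancing idea in the last bullet can be sketched in a few lines: sample balanced pseudo-features from per-class Gaussians (mean prototype plus covariance) and fine-tune only the linear classifier on them, so old classes are represented without stored exemplars. This is a hedged sketch under simplifying assumptions — plain softmax regression with vanilla gradient descent, and the function name and hyperparameters are illustrative, not the paper's actual procedure.

```python
import numpy as np

def rebalance_classifier(prototypes, covariances, W, b,
                         n_per_class=20, lr=0.1, steps=50, seed=0):
    """Post-training prototype re-balancing (sketch).

    Draws a class-balanced batch of pseudo-features from each class's
    Gaussian model and takes softmax cross-entropy gradient steps on the
    classifier (W, b) only, counteracting task-recency bias.
    """
    rng = np.random.default_rng(seed)
    n_classes = prototypes.shape[0]
    for _ in range(steps):
        X, y = [], []
        for c in range(n_classes):  # balanced: same count per class
            X.append(rng.multivariate_normal(
                prototypes[c], covariances[c], n_per_class))
            y.extend([c] * n_per_class)
        X, y = np.vstack(X), np.array(y)
        logits = X @ W.T + b
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        p[np.arange(len(y)), y] -= 1.0      # softmax CE gradient
        W -= lr * (p.T @ X) / len(y)
        b -= lr * p.mean(axis=0)
    return W, b
```

Because the pseudo-batches are balanced across all classes seen so far, the classifier is no longer dominated by the most recent task's gradients.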