Annotation-Free Class-Incremental Learning

📅 2025-11-24
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Existing continual learning (CL) paradigms assume continuous availability of labeled data—an unrealistic assumption in real-world streaming scenarios. To address this, the paper proposes *Annotation-Free Class-Incremental Learning* (AFCIL), a novel CL paradigm where classes emerge sequentially, unlabeled data arrives in a stream, and no labels are ever provided. Method: The authors introduce *CrossWorld CL*, a framework that leverages external world knowledge as semantic priors to mitigate catastrophic forgetting and enable unsupervised novel-class discovery. It integrates cross-domain alignment, ImageNet-based semantic retrieval, knowledge-guided feature mapping, and a novel label-free replay mechanism. Contribution/Results: This is the first method achieving fully label-free class-incremental learning. It significantly outperforms CLIP and state-of-the-art continual learning approaches on four standard benchmarks, demonstrating the efficacy and generalizability of harnessing world knowledge for unsupervised continual learning.

πŸ“ Abstract
Despite significant progress in continual learning, ranging from architectural novelty to clever strategies for mitigating catastrophic forgetting, most existing methods rest on a strong but unrealistic assumption: the availability of labeled data throughout the learning process. In real-world scenarios, however, data often arrives sequentially and without annotations, rendering conventional approaches impractical. In this work, we revisit the fundamental assumptions of continual learning and ask: can current systems adapt when labels are absent and tasks emerge incrementally over time? To this end, we introduce Annotation-Free Class-Incremental Learning (AFCIL), a more realistic and challenging paradigm where unlabeled data arrives continuously and the learner must incrementally acquire new classes without any supervision. To enable effective learning under AFCIL, we propose CrossWorld CL, a Cross-Domain World-Guided Continual Learning framework that incorporates external world knowledge as a stable auxiliary source. The method retrieves semantically related ImageNet classes for each downstream category, maps downstream and ImageNet features through a cross-domain alignment strategy, and finally introduces a novel replay strategy. This design lets the model uncover semantic structure without annotations while keeping earlier knowledge intact. Across four datasets, CrossWorld CL surpasses CLIP baselines and existing continual and unlabeled learning methods, underscoring the benefit of world knowledge for annotation-free continual learning.
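The cross-domain alignment step mentioned in the abstract can be illustrated with a minimal sketch: learn a linear map from downstream features into the ImageNet feature space. This is an illustrative toy under stated assumptions (paired features and a least-squares linear map), not the paper's actual implementation; all variable names and dimensions are invented.

```python
import numpy as np

# Toy cross-domain alignment: fit a linear map W that carries downstream
# features into the ImageNet feature space, assuming we have (noisy) pairs.
rng = np.random.default_rng(1)
true_W = rng.normal(size=(4, 4))                        # hidden ground-truth map
down = rng.normal(size=(20, 4))                         # downstream features
img = down @ true_W + 0.01 * rng.normal(size=(20, 4))   # paired ImageNet-space features

# Least-squares solution to down @ W ≈ img
W, *_ = np.linalg.lstsq(down, img, rcond=None)

# Relative reconstruction error of the learned alignment
err = np.linalg.norm(down @ W - img) / np.linalg.norm(img)
print(f"relative alignment error: {err:.3f}")
```

In the actual method the alignment would operate on learned visual features rather than synthetic vectors, but the principle of mapping one feature space onto another is the same.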
Problem

Research questions and friction points this paper is trying to address.

Addresses continual learning without labeled data availability
Enables incremental class acquisition from unlabeled sequential data
Mitigates catastrophic forgetting through external knowledge integration
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses external world knowledge as auxiliary source
Maps downstream and ImageNet features via cross-domain alignment
Introduces novel replay strategy to preserve earlier knowledge
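The retrieval of semantically related ImageNet classes can be sketched as a nearest-neighbor lookup in a shared embedding space. This is a hypothetical sketch: `retrieve_related_classes`, the cosine-similarity criterion, and the toy embeddings are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

def retrieve_related_classes(downstream_proto, imagenet_embeds, k=3):
    """Return indices of the k ImageNet classes whose embeddings are most
    cosine-similar to a downstream class prototype (illustrative only)."""
    a = downstream_proto / np.linalg.norm(downstream_proto)
    b = imagenet_embeds / np.linalg.norm(imagenet_embeds, axis=1, keepdims=True)
    sims = b @ a                       # cosine similarities to every class
    return np.argsort(-sims)[:k]      # indices of the k most similar classes

# Toy example: 5 ImageNet class embeddings in a 4-d space; the downstream
# prototype is a slightly perturbed copy of class 2, so class 2 should rank first.
rng = np.random.default_rng(0)
imagenet = rng.normal(size=(5, 4))
proto = imagenet[2] + 0.05 * rng.normal(size=4)
print(retrieve_related_classes(proto, imagenet, k=2))
```

In practice such prototypes would come from clustering unlabeled downstream features, and the class embeddings from a pretrained model; here both are random vectors purely for demonstration.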