Exploring the Tradeoff Between Diversity and Discrimination for Continuous Category Discovery

📅 2025-08-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the novel class discovery (NCD) problem in continual learning under unlabeled streaming data, aiming to reconcile the conflict between discovering emerging classes and preserving accurate classification of previously seen classes—while mitigating error accumulation and catastrophic forgetting. Methodologically, we propose IDOD, a unified framework that integrates independent diversity enhancement, joint novelty detection, and orthogonal prototype incremental updating, thereby reducing conventional multi-stage NCD to a single-stage end-to-end process. IDOD further incorporates contrastive learning–driven independent feature training, orthogonal prototype generation, and representative representation replay to jointly optimize discriminability and diversity. Evaluated on multiple fine-grained benchmarks, IDOD achieves substantial improvements in novel class discovery accuracy (+3.2–7.8%) while reducing memory overhead by up to 42%, outperforming existing state-of-the-art methods.

Technology Category

Application Category

📝 Abstract
Continuous category discovery (CCD) aims to automatically discover novel categories in continuously arriving unlabeled data. This is a challenging problem considering that there is no number of categories and labels in the newly arrived data, while also needing to mitigate catastrophic forgetting. Most CCD methods cannot handle the contradiction between novel class discovery and classification well. They are also prone to accumulate errors in the process of gradually discovering novel classes. Moreover, most of them use knowledge distillation and data replay to prevent forgetting, occupying more storage space. To address these limitations, we propose Independence-based Diversity and Orthogonality-based Discrimination (IDOD). IDOD mainly includes independent enrichment of diversity module, joint discovery of novelty module, and continuous increment by orthogonality module. In independent enrichment, the backbone is trained separately using contrastive loss to avoid it focusing only on features for classification. Joint discovery transforms multi-stage novel class discovery into single-stage, reducing error accumulation impact. Continuous increment by orthogonality module generates mutually orthogonal prototypes for classification and prevents forgetting with lower space overhead via representative representation replay. Experimental results show that on challenging fine-grained datasets, our method outperforms the state-of-the-art methods.
Problem

Research questions and friction points this paper is trying to address.

Balancing diversity and discrimination in continuous category discovery
Reducing error accumulation during novel class discovery
Minimizing storage overhead while preventing catastrophic forgetting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Independent diversity enrichment via contrastive loss
Single-stage joint discovery reducing error accumulation
Orthogonal prototypes for classification with low overhead
🔎 Similar Papers
No similar papers found.