CoMBO: Conflict Mitigation via Branched Optimization for Class Incremental Segmentation

📅 2025-04-05
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the inherent tension between catastrophic forgetting and insufficient plasticity in Class Incremental Segmentation (CIS), covering both semantic and panoptic settings, this paper proposes CoMBO, a branched optimization framework for conflict mitigation. Methodologically, it introduces three key components: (1) a Query Conflict Reduction module that refines queries for new classes through lightweight, class-specific adapters, adding a separate branch for new-class acquisition while preserving the original queries for distillation; (2) Half-Learning Half-Distillation (HDHL) over classification probabilities, which learns from queries matched to new-class ground truth while aligning unmatched queries to the old model's outputs; and (3) Importance-Based Knowledge Distillation (IKD) over query features, which weights distillation by each query's matching degree to old classes so that less critical features remain free to evolve. Extensive experiments on multiple CIS benchmarks demonstrate substantial improvements over state-of-the-art methods, achieving a superior trade-off between old-class retention and new-class accuracy.

📝 Abstract
Effective Class Incremental Segmentation (CIS) requires simultaneously mitigating catastrophic forgetting and ensuring sufficient plasticity to integrate new classes. The inherent conflict above often leads to a back-and-forth, which turns the objective into finding the balance between the performance of previous (old) and incremental (new) classes. To address this conflict, we introduce a novel approach, Conflict Mitigation via Branched Optimization (CoMBO). Within this approach, we present the Query Conflict Reduction module, designed to explicitly refine queries for new classes through lightweight, class-specific adapters. This module provides an additional branch for the acquisition of new classes while preserving the original queries for distillation. Moreover, we develop two strategies to further mitigate the conflict following the branched structure, i.e., the Half-Learning Half-Distillation (HDHL) over classification probabilities, and the Importance-Based Knowledge Distillation (IKD) over query features. HDHL selectively engages in learning for classification probabilities of queries that match the ground truth of new classes, while aligning unmatched ones to the corresponding old probabilities, thus ensuring retention of old knowledge while absorbing new classes via learning negative samples. Meanwhile, IKD assesses the importance of queries based on their matching degree to old classes, prioritizing the distillation of important features and allowing less critical features to evolve. Extensive experiments in Class Incremental Panoptic and Semantic Segmentation settings have demonstrated the superior performance of CoMBO. Project page: https://guangyu-ryan.github.io/CoMBO.
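The HDHL idea described in the abstract, i.e. learn on queries matched to new-class ground truth and distill the rest toward the old model, can be sketched as a per-query loss. This is a minimal illustration, not the paper's implementation: the function name `hdhl_loss`, the plain-list inputs, and the choice of cross-entropy for the learning half and KL divergence for the distillation half are assumptions for the sketch.

```python
import math

def hdhl_loss(probs_new, probs_old, gt_labels):
    """Sketch of Half-Learning Half-Distillation (assumed formulation).

    probs_new: per-query class probabilities from the current model.
    probs_old: per-query class probabilities from the frozen old model.
    gt_labels: ground-truth class index for queries matched to new-class
               ground truth, or None for unmatched queries.
    """
    total = 0.0
    for p_new, p_old, gt in zip(probs_new, probs_old, gt_labels):
        if gt is not None:
            # Learning half: cross-entropy against the new-class ground truth.
            total += -math.log(max(p_new[gt], 1e-12))
        else:
            # Distillation half: KL divergence pulling the current output
            # toward the old model's probabilities, retaining old knowledge.
            total += sum(q * math.log(max(q, 1e-12) / max(p, 1e-12))
                         for q, p in zip(p_old, p_new))
    return total / len(probs_new)
```

In this toy form, a query matched to a new class contributes a standard classification term, while an unmatched query contributes zero loss exactly when the current model reproduces the old model's distribution.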
Problem

Research questions and friction points this paper is trying to address.

Balancing old and new class performance in incremental segmentation
Mitigating catastrophic forgetting while integrating new classes
Refining queries for new classes without losing old knowledge
Innovation

Methods, ideas, or system contributions that make the work stand out.

Branched optimization for conflict mitigation
Query Conflict Reduction with lightweight adapters
Half-Learning Half-Distillation and Importance-Based Knowledge Distillation
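The Importance-Based Knowledge Distillation listed above weights feature distillation by how strongly each query matches old classes. A minimal sketch, assuming squared-error feature distillation and a precomputed per-query importance score (the name `ikd_loss` and the exact weighting are illustrative, not from the paper):

```python
def ikd_loss(feats_new, feats_old, match_scores):
    """Sketch of Importance-Based Knowledge Distillation (assumed form).

    feats_new:    per-query feature vectors from the current model.
    feats_old:    per-query feature vectors from the frozen old model.
    match_scores: per-query importance in [0, 1], e.g. matching degree
                  to old classes; high scores are distilled harder,
                  low scores leave the feature free to evolve.
    """
    total = 0.0
    for f_new, f_old, w in zip(feats_new, feats_old, match_scores):
        sq_dist = sum((a - b) ** 2 for a, b in zip(f_new, f_old))
        total += w * sq_dist
    return total / len(feats_new)
```

A query with near-zero importance contributes almost nothing, so its feature can drift to accommodate new classes without penalty.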