One Adapter for All: Towards Unified Representation in Step-Imbalanced Class-Incremental Learning

📅 2026-03-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge in class-incremental learning where imbalanced task step sizes cause large tasks to dominate and small tasks to suffer from unstable updates. To mitigate this issue, the authors propose One-A, a unified imbalance-aware framework that integrates dynamically sized incremental tasks into a single adapter while maintaining constant inference overhead. One-A achieves a balance between stability and plasticity through asymmetric subspace alignment, information-adaptive weighting, and a singular vector direction gating mechanism. Experimental results demonstrate that One-A achieves competitive accuracy across multiple benchmarks and imbalanced task streams, while significantly reducing inference costs compared to existing approaches.

📝 Abstract
Class-incremental learning (CIL) aims to acquire new classes over time while retaining prior knowledge, yet most setups and methods assume balanced task streams. In practice, the number of classes per task often varies significantly. We refer to this as step imbalance, where large tasks that contain more classes dominate learning and small tasks inject unstable updates. Existing CIL methods assume balanced tasks and therefore treat all tasks uniformly, producing imbalanced updates that degrade overall learning performance. To address this challenge, we propose One-A, a unified and imbalance-aware framework that incrementally merges task updates into a single adapter, maintaining constant inference cost. One-A performs asymmetric subspace alignment to preserve dominant subspaces learned from large tasks while constraining low-information updates within them. An information-adaptive weighting scheme balances the contributions of the base and new adapters, and a directional gating mechanism selectively fuses updates along each singular direction, maintaining stability in head directions and plasticity in tail ones. Across multiple benchmarks and step-imbalanced streams, One-A achieves competitive accuracy with significantly lower inference overhead, showing that a single, asymmetrically fused adapter can remain both adaptive to dynamic task sizes and efficient at deployment.
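The fusion mechanism described above can be sketched in code. The following is a minimal illustrative example, not the paper's implementation: it assumes adapter updates are plain weight matrices, uses the SVD of the base update to define head and tail singular directions, zeroes the gate on the head directions for stability, and scales the tail contribution by a simple energy-ratio weight standing in for the information-adaptive weighting. The function name `fuse_adapters` and all parameter choices (`k_head`, `alpha`) are hypothetical.

```python
import numpy as np

def fuse_adapters(base: np.ndarray, new: np.ndarray,
                  k_head: int = 4, alpha: float = 0.5) -> np.ndarray:
    """Hypothetical sketch of singular-direction gated adapter fusion.

    base, new: weight-update matrices (d_out x d_in) of the unified base
    adapter and the newly trained task adapter. Head singular directions
    of the base update are frozen (stability); tail directions absorb
    the new task's update (plasticity), scaled by an energy-based weight.
    """
    # Singular directions of the base adapter's update.
    U, S, Vt = np.linalg.svd(base, full_matrices=False)

    # Express the new update in the base's singular coordinate system.
    coeffs = U.T @ new @ Vt.T  # (k, k) per-direction mixing coefficients

    # Directional gate: 0 on head directions, 1 on tail directions.
    gate = np.ones_like(S)
    gate[:k_head] = 0.0

    # Stand-in for information-adaptive weighting: share of total energy
    # carried by the new update, scaled by alpha.
    w = alpha * np.linalg.norm(new) / (np.linalg.norm(base) + np.linalg.norm(new))

    # Merge only the gated (tail) components back into the single adapter.
    return base + w * (U @ (gate[:, None] * coeffs) @ Vt)
```

Because everything is folded back into one matrix, inference cost stays constant no matter how many tasks have been merged, which mirrors the constant-overhead property claimed for One-A.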
Problem

Research questions and friction points this paper is trying to address.

class-incremental learning
step imbalance
task imbalance
unbalanced tasks
incremental learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

step-imbalanced class-incremental learning
unified adapter
asymmetric subspace alignment
information-adaptive weighting
directional gating
🔎 Similar Papers
2024-03-29 · Computer Vision and Pattern Recognition · Citations: 25
Xiaoyan Zhang
University of Michigan
Jiangpeng He
Purdue University
Computer Vision · Deep Learning