Forget Less by Learning Together through Concept Consolidation

📅 2026-01-05
🏛️ arXiv.org
📈 Citations: 0 · Influential: 0
🤖 AI Summary
This work addresses catastrophic forgetting in custom diffusion models that continually learn new concepts, a problem existing approaches leave largely unaddressed because they typically assume a fixed learning sequence and overlook inter-concept interactions. To overcome this limitation, we propose FL2T, a novel framework that, for the first time, enables order-agnostic concurrent learning of multiple concepts. FL2T integrates a set-invariant cross-concept learning module with a proxy-guided feature selection mechanism to preserve and transfer knowledge across tasks. Experimental results on three benchmark datasets demonstrate that FL2T significantly outperforms baseline methods, achieving an average improvement of at least 2% in CLIP image-alignment scores over ten-task incremental learning scenarios, thereby effectively mitigating catastrophic forgetting and enhancing concept retention.
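The CLIP image-alignment score cited above is, in the usual custom-diffusion evaluation protocol, the average cosine similarity between CLIP image embeddings of generated samples and reference images of the same concept. Below is a minimal sketch of that metric, assuming Hugging Face transformers and the openai/clip-vit-base-patch32 checkpoint; the model choice and the helper name are illustrative assumptions, not details from the paper.

```python
# Sketch of a CLIP image-alignment metric: mean pairwise cosine similarity
# between CLIP embeddings of generated and reference images.
# Checkpoint and helper name are illustrative, not from the paper.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

@torch.no_grad()
def clip_image_alignment(generated, references):
    """Mean pairwise cosine similarity between two sets of PIL images."""
    def embed(images):
        inputs = processor(images=images, return_tensors="pt")
        feats = model.get_image_features(**inputs)
        return feats / feats.norm(dim=-1, keepdim=True)  # unit-normalize
    gen, ref = embed(generated), embed(references)
    # Average over all generated/reference pairs.
    return (gen @ ref.T).mean().item()

# Example: clip_image_alignment([Image.open("gen.png")], [Image.open("ref.png")])
```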

📝 Abstract
Custom Diffusion Models (CDMs) have gained significant attention due to their remarkable ability to personalize generative processes. However, existing CDMs suffer from catastrophic forgetting when continuously learning new concepts. Most prior works attempt to mitigate this issue under a sequential learning setting with a fixed order of concept inflow, and they neglect inter-concept interactions. In this paper, we propose a novel framework, Forget Less by Learning Together (FL2T), that enables concurrent and order-agnostic concept learning while addressing catastrophic forgetting. Specifically, we introduce a set-invariant inter-concept learning module in which proxies guide feature selection across concepts, facilitating improved knowledge retention and transfer. By leveraging inter-concept guidance, our approach preserves old concepts while efficiently incorporating new ones. Extensive experiments across three datasets demonstrate that our method significantly improves concept retention and mitigates catastrophic forgetting, highlighting the effectiveness of inter-concept catalytic behavior in ten-task incremental concept learning, with at least a 2% average gain in CLIP Image Alignment scores.
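The core mechanism the abstract describes is a set-invariant module in which learnable proxies guide feature selection across the set of concepts. The paper's code is not reproduced here, so the following is only a hypothetical PyTorch sketch of what such a module could look like; the per-concept proxy parameterization, the attention-based set pooling, and the sigmoid gating are all assumptions made for illustration, not the FL2T implementation.

```python
# Hypothetical proxy-guided, set-invariant feature selection (illustration only).
# Learnable proxies attend over the set of concept features; the attended
# summary gates each concept's features. Design choices here are assumptions.
import torch
import torch.nn as nn

class ProxyGuidedSelection(nn.Module):
    def __init__(self, num_concepts: int, dim: int, num_heads: int = 4):
        super().__init__()
        # One learnable proxy vector per concept slot.
        self.proxies = nn.Parameter(torch.randn(num_concepts, dim))
        # Attention output is invariant to the order of key/value elements,
        # so the module treats the concept features as an unordered set.
        self.attn = nn.MultiheadAttention(dim, num_heads=num_heads, batch_first=True)
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, concept_feats: torch.Tensor) -> torch.Tensor:
        # concept_feats: (batch, num_concepts, dim)
        proxies = self.proxies.unsqueeze(0).expand(concept_feats.size(0), -1, -1)
        # Each proxy queries the whole set of concept features.
        summary, _ = self.attn(proxies, concept_feats, concept_feats)
        # The attended summary gates (selects) each concept's features.
        return concept_feats * self.gate(summary)

# Toy usage: batch of 2, 5 concepts, 64-dim features.
feats = torch.randn(2, 5, 64)
out = ProxyGuidedSelection(num_concepts=5, dim=64)(feats)  # shape (2, 5, 64)
```

Because attention over the key/value set is permutation-invariant for fixed queries, the output does not depend on the order in which concepts arrive, which is one plausible way to realize the order-agnostic learning the abstract claims.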
Problem

Research questions and friction points this paper is trying to address.

catastrophic forgetting
custom diffusion models
concept learning
inter-concept interaction
incremental learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

concept consolidation
catastrophic forgetting
custom diffusion models
inter-concept learning
incremental learning