Dynamic Dual Buffer with Divide-and-Conquer Strategy for Online Continual Learning

📅 2025-05-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address severe catastrophic forgetting and inefficient memory updating in online continual learning (OCL), this paper proposes a dynamic dual-buffer memory framework. It comprises a short-term buffer that captures instantaneous changes in streaming data and a long-term buffer partitioned into multiple sub-buffers, in which knowledge is archived via class prototypes. Optimal transport theory guides prototype-aware sample retention, and K-means clustering is used to preserve semantic richness. A Divide-and-Conquer (DAC) memory updating strategy decomposes the global optimisation into parallelisable subproblems. Evaluated on standard and class-imbalanced OCL benchmarks, the method achieves state-of-the-art performance: it significantly reduces forgetting rates and cuts the computational overhead of memory updates by 42%.

📝 Abstract
Online Continual Learning (OCL) presents a complex learning environment in which new data arrives in a batch-to-batch online format, and the risk of catastrophic forgetting can significantly impair model efficacy. In this study, we address OCL by introducing an innovative memory framework that incorporates a short-term memory system to retain dynamic information and a long-term memory system to archive enduring knowledge. Specifically, the long-term memory system comprises a collection of sub-memory buffers, each linked to a cluster prototype and designed to retain data samples from distinct categories. We propose a novel $K$-means-based sample selection method to identify cluster prototypes for each encountered category. To safeguard essential and critical samples, we introduce a novel memory optimisation strategy that selectively retains samples in the appropriate sub-memory buffer by evaluating each cluster prototype against incoming samples through an optimal transportation mechanism. This approach encourages each sub-memory buffer to retain data samples that exhibit significant discrepancies from the corresponding cluster prototype, thereby ensuring the preservation of semantically rich information. In addition, we propose a novel Divide-and-Conquer (DAC) approach that formulates memory updating as an optimisation problem and divides it into several subproblems. As a result, the proposed DAC approach solves these subproblems separately, significantly reducing the computational cost of the memory updating process. We conduct a series of experiments across standard and imbalanced learning settings, and the empirical findings indicate that the proposed memory framework achieves state-of-the-art performance in both learning contexts.
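The two memory operations described in the abstract can be sketched concretely. Below is a minimal, hypothetical NumPy sketch, not the paper's implementation: all function names are my own, a plain Lloyd K-means picks per-class prototypes, a small entropic-regularised Sinkhorn solver stands in for the optimal transportation mechanism, and each sub-buffer then keeps the incoming samples most discrepant from its prototype.

```python
import numpy as np

def kmeans_prototypes(feats, k, iters=50, seed=0):
    """Lloyd's K-means, then return the real sample nearest each centroid
    as that cluster's prototype (illustrative stand-in for the paper's
    K-means-based sample selection)."""
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), k, replace=False)].astype(float)
    for _ in range(iters):
        assign = np.linalg.norm(feats[:, None] - centers[None], axis=2).argmin(1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = feats[assign == j].mean(axis=0)
    nearest = np.linalg.norm(feats[:, None] - centers[None], axis=2).argmin(0)
    return feats[nearest]

def sinkhorn_plan(cost, eps=0.1, iters=200):
    """Entropic-OT transport plan between uniform marginals (Sinkhorn)."""
    K = np.exp(-cost / eps)
    a = np.full(cost.shape[0], 1.0 / cost.shape[0])
    b = np.full(cost.shape[1], 1.0 / cost.shape[1])
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

def route_and_retain(incoming, prototypes, budget):
    """Assign each incoming sample to the sub-buffer whose prototype the
    OT plan couples it to most strongly, then keep the `budget` samples
    farthest from that prototype (preserving semantically rich outliers)."""
    cost = np.linalg.norm(incoming[:, None] - prototypes[None, :], axis=2)
    assign = sinkhorn_plan(cost).argmax(axis=1)
    kept = {}
    for j in range(len(prototypes)):
        idx = np.where(assign == j)[0]
        dist = np.linalg.norm(incoming[idx] - prototypes[j], axis=1)
        kept[j] = idx[np.argsort(-dist)[:budget]]
    return kept
```

Keeping the most-discrepant samples per sub-buffer is the sketch's reading of the abstract's retention rule; the paper's actual objective may weight the transport plan differently.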
Problem

Research questions and friction points this paper is trying to address.

Addresses catastrophic forgetting in online continual learning
Introduces a dual memory system for dynamic and enduring knowledge
Proposes a Divide-and-Conquer strategy to optimize memory updates

Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic Dual Buffer for memory retention
K-means-based sample selection method
Divide-and-Conquer memory updating strategy
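The DAC contribution is that a separable global objective splits into one independent subproblem per sub-buffer, so the subproblems can be solved in parallel. The sketch below illustrates only that decomposition structure; the per-buffer objective (farthest-from-prototype selection) and all names are hypothetical stand-ins, not the paper's OT-based formulation.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def update_sub_buffer(current, candidates, prototype, capacity):
    """One subproblem: from this buffer's current and candidate samples,
    keep the `capacity` samples farthest from the prototype."""
    pool = np.concatenate([current, candidates])
    dist = np.linalg.norm(pool - prototype, axis=1)
    return pool[np.argsort(-dist)[:capacity]]

def dac_update(buffers, candidates, prototypes, capacity):
    """Divide-and-conquer: the global memory update decomposes into one
    independent subproblem per sub-buffer, solved here in parallel."""
    with ThreadPoolExecutor() as ex:
        futures = [ex.submit(update_sub_buffer, buffers[j], candidates[j],
                             prototypes[j], capacity)
                   for j in range(len(prototypes))]
        return [f.result() for f in futures]
```

Because no subproblem reads another's state, the solve time scales with the largest sub-buffer rather than the whole memory, which is the source of the reported reduction in update overhead.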
Congren Dai
Department of Computing, Imperial College London, United Kingdom
Huichi Zhou
University College London
Jiahao Huang
Department of Bioengineering, Imperial College London, United Kingdom
Zhenxuan Zhang
Georgia Institute of Technology
Fanwen Wang
Imperial College London
Guang Yang
Department of Bioengineering, Imperial College London, United Kingdom
Fei Ye
School of Information and Software Engineering, University of Electronic Science and Technology of China, China