Cost-Efficient Continual Learning with Sufficient Exemplar Memory

📅 2025-02-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the problem of computational efficiency in continual learning (CL) under a realistic large-model-era setting where GPU compute is expensive but memory is abundant, contrary to the traditional CL assumption of extremely limited memory budgets. The authors propose a weight-space-oriented paradigm comprising selective weight resetting, cross-task weight averaging, and efficient exemplar replay, which jointly accelerates training and improves generalization under relaxed memory constraints. Experiments on mainstream CL benchmarks show that the method achieves state-of-the-art performance while reducing GPU compute cost to only 25–33% of that of existing approaches. The framework establishes a scalable, cost-effective baseline for practical continual learning, bridging the gap between theoretical CL assumptions and modern hardware realities.

📝 Abstract
Continual learning (CL) research typically assumes highly constrained exemplar memory resources. However, in many real-world scenarios, especially in the era of large foundation models, memory is abundant, while GPU computational costs are the primary bottleneck. In this work, we investigate CL in a novel setting where exemplar memory is ample (i.e., sufficient exemplar memory). Unlike prior methods designed for strict exemplar memory constraints, we propose a simple yet effective approach that operates directly in the model's weight space through a combination of weight resetting and averaging techniques. Our method achieves state-of-the-art performance while reducing the computational cost to a quarter to a third of that of existing methods. These findings challenge conventional CL assumptions and provide a practical baseline for computationally efficient CL applications.
Problem

Research questions and friction points this paper is trying to address.

Reduce the computational cost of continual learning
Exploit ample exemplar memory efficiently
Challenge conventional assumptions about memory constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Weight resetting techniques
Weight averaging methods
Reduced computational cost
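The weight-space operations listed above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's exact procedure: it assumes uniform element-wise averaging of per-task weight snapshots and a magnitude-based criterion for selecting which weights to reset; the `average_weights`, `selective_reset`, and `fraction` names are hypothetical.

```python
import numpy as np

def average_weights(task_weights):
    """Cross-task weight averaging: element-wise mean over a list of
    per-task weight dictionaries (assumes uniform weighting)."""
    keys = task_weights[0].keys()
    return {k: np.mean([w[k] for w in task_weights], axis=0) for k in keys}

def selective_reset(weights, init_weights, fraction=0.1):
    """Selective weight resetting: restore the smallest-magnitude `fraction`
    of each tensor's entries to their initial values. Magnitude is one
    plausible selection rule; the paper's criterion may differ."""
    out = {}
    for name, w in weights.items():
        flat = np.abs(w).ravel()
        k_small = max(1, int(fraction * flat.size))
        # k_small-th smallest absolute value acts as the reset threshold
        threshold = np.partition(flat, k_small - 1)[k_small - 1]
        mask = np.abs(w) <= threshold
        out[name] = np.where(mask, init_weights[name], w)
    return out
```

In a replay-based training loop, one would snapshot the weights after each task, average the snapshots to consolidate knowledge across tasks, and apply the selective reset before fine-tuning on the exemplar buffer; both operations cost far less GPU compute than additional training passes.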
🔎 Similar Papers
No similar papers found.