🤖 AI Summary
To address catastrophic forgetting and inefficient cross-task knowledge transfer in pre-trained models (PTMs) under continual learning, this work proposes a modular, plug-and-play continual learning toolbox that unifies support for multiple paradigms and enables consistent evaluation across them. Methodologically, it integrates Elastic Weight Consolidation (EWC), experience replay, and lightweight prompt tuning within the Hugging Face ecosystem, balancing stability and plasticity. Evaluated on 12 standard streaming benchmarks, the approach achieves an average accuracy gain of 5.2%, reduces forgetting by 37%, and substantially accelerates adaptation to new tasks. The core contribution is the first open-source, scalable, multi-paradigm continual learning framework designed specifically for PTMs, providing a systematic solution for model evolution in streaming-data scenarios.