PPSEBM: An Energy-Based Model with Progressive Parameter Selection for Continual Learning

📅 2025-12-17
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address catastrophic forgetting in continual learning, this paper proposes PPSEBM: an Energy-Based Model (EBM) generates high-fidelity pseudo-samples from earlier tasks, and these samples guide a Progressive Parameter Selection (PPS) mechanism that dynamically allocates task-specific parameter subsets, enabling low-redundancy knowledge retention without storing real past data. Crucially, EBM-driven pseudo-sample generation is tightly integrated with fine-grained parameter isolation for the first time, eliminating both explicit storage of past data and model capacity expansion. Evaluated on multiple NLP continual learning benchmarks, PPSEBM outperforms state-of-the-art methods, achieving average accuracy gains of 3.2 to 5.7 percentage points. It mitigates forgetting while supporting joint optimization across both old and new tasks.
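The paper does not include implementation details in this summary, but the pseudo-sample generation step can be illustrated with a minimal sketch. The assumptions here are mine: a toy quadratic energy function standing in for the learned EBM, and unadjusted Langevin dynamics as the sampler (a common choice for drawing samples from EBMs).

```python
import numpy as np

def energy_grad(x, mu):
    # Gradient of a toy quadratic energy E(x) = 0.5 * ||x - mu||^2,
    # whose minimum sits at the stored prototype mu of an old task.
    return x - mu

def langevin_pseudo_samples(mu, n_samples=64, n_steps=100, step=0.1, seed=0):
    """Draw pseudo-samples from an EBM via unadjusted Langevin dynamics:
    x_{t+1} = x_t - (step / 2) * dE/dx + sqrt(step) * noise
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(n_samples, mu.shape[0]))  # random initialization
    for _ in range(n_steps):
        noise = rng.normal(size=x.shape)
        x = x - 0.5 * step * energy_grad(x, mu) + np.sqrt(step) * noise
    return x

mu = np.array([2.0, -1.0])               # hypothetical old-task prototype
samples = langevin_pseudo_samples(mu)
print(samples.mean(axis=0))              # concentrates near mu
```

In PPSEBM these pseudo-samples would be fed back as stand-ins for old-task data; here the quadratic energy merely makes the sampler's behavior easy to verify.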

๐Ÿ“ Abstract
Continual learning remains a fundamental challenge in machine learning, requiring models to learn from a stream of tasks without forgetting previously acquired knowledge. A major obstacle in this setting is catastrophic forgetting, where performance on earlier tasks degrades as new tasks are learned. In this paper, we introduce PPSEBM, a novel framework that integrates an Energy-Based Model (EBM) with Progressive Parameter Selection (PPS) to effectively address catastrophic forgetting in continual learning for natural language processing tasks. In PPSEBM, progressive parameter selection allocates distinct, task-specific parameters for each new task, while the EBM generates representative pseudo-samples from prior tasks. These generated samples actively inform and guide the parameter selection process, enhancing the model's ability to retain past knowledge while adapting to new tasks. Experimental results on diverse NLP benchmarks demonstrate that PPSEBM outperforms state-of-the-art continual learning methods, offering a promising and robust solution to mitigate catastrophic forgetting.
Problem

Research questions and friction points this paper is trying to address.

Catastrophic forgetting: performance on earlier tasks degrades as new tasks are learned
How to combine generative pseudo-rehearsal with parameter isolation without storing past data or expanding model capacity
How to retain prior knowledge while adapting to new tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Energy-Based Model generates pseudo-samples from prior tasks
Progressive Parameter Selection allocates task-specific parameters for new tasks
Generated samples guide parameter selection to retain past knowledge
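The selection step in the bullets above can be sketched as a masking scheme. This is an illustrative assumption, not the paper's exact algorithm: each task claims a top-k subset of still-free parameters, ranked by an importance score (e.g., gradient magnitude measured on the EBM's pseudo-samples), so task-specific subsets never overlap.

```python
import numpy as np

def progressive_select(used_mask, importance, k):
    """Allocate k task-specific parameter slots for a new task.

    Picks the k highest-importance positions among parameters not yet
    claimed by earlier tasks (used_mask == True means already allocated).
    `importance` stands in for a score such as gradient magnitude on
    EBM-generated pseudo-samples.
    """
    scores = np.where(used_mask, -np.inf, importance)  # exclude claimed slots
    chosen = np.argsort(scores)[-k:]                   # top-k free positions
    new_mask = np.zeros_like(used_mask)
    new_mask[chosen] = True
    return new_mask

rng = np.random.default_rng(0)
n_params = 10
used = np.zeros(n_params, dtype=bool)
used[:3] = True                         # slots claimed by task 1
imp = rng.random(n_params)              # hypothetical importance scores
mask_t2 = progressive_select(used, imp, k=3)
print(mask_t2)                          # disjoint from task 1's slots
```

Keeping the per-task masks disjoint is what makes the allocation "progressive": earlier tasks' parameters stay frozen, so learning task 2 cannot overwrite them.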