🤖 AI Summary
This work addresses the challenge posed by temporal dynamics and distribution shifts in continual learning, where conventional normalization methods—relying on global data statistics—struggle to adapt. To overcome this limitation, we propose CLeAN, an adaptive normalization method tailored for tabular data that, for the first time, integrates learnable normalization parameters with an exponential moving average (EMA) mechanism into continual learning. CLeAN dynamically estimates and updates feature scales in response to incoming data streams. It integrates seamlessly with mainstream replay- and regularization-based strategies such as Reservoir Replay, A-GEM, and EWC. Experiments on two benchmark datasets demonstrate that CLeAN significantly mitigates catastrophic forgetting while simultaneously improving performance on new tasks, thereby enhancing model stability and adaptability in dynamic environments.
📝 Abstract
Artificial intelligence systems predominantly rely on static data distributions, making them ineffective in dynamic real-world environments such as cybersecurity, autonomous transportation, or finance, where data shifts frequently. Continual learning offers a potential solution by enabling models to learn from sequential data while retaining prior knowledge. However, a critical and underexplored issue in this domain is data normalization. Conventional normalization methods, such as min-max scaling, presuppose access to the entire dataset, which is incongruent with the sequential nature of continual learning. In this paper, we introduce Continual Learning Adaptive Normalization (CLeAN), a novel adaptive normalization technique designed for continual learning on tabular data. CLeAN estimates global feature scales using learnable parameters that are updated via an Exponential Moving Average (EMA) module, enabling the model to adapt to evolving data distributions. Through comprehensive evaluations on two datasets and several continual learning strategies, including Reservoir Experience Replay, A-GEM, and EWC, we demonstrate that CLeAN not only improves model performance on new data but also mitigates catastrophic forgetting. The findings underscore the importance of adaptive normalization in enhancing the stability and effectiveness of continual learning on tabular data, offering a novel perspective on the use of normalization to preserve knowledge in dynamic learning environments.
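The abstract does not spell out CLeAN's exact update rule, but the core idea of estimating feature scales with an EMA over a data stream can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, the `momentum` parameter, and the min/max-based update are assumptions standing in for the learnable parameters described in the paper.

```python
import numpy as np

class EMAFeatureScaler:
    """Illustrative streaming min-max scaler whose per-feature bounds are
    updated with an exponential moving average (EMA) as batches arrive.
    A hypothetical sketch of the general idea behind EMA-based adaptive
    normalization; not the CLeAN method itself."""

    def __init__(self, momentum: float = 0.99):
        self.momentum = momentum  # weight on the running estimate
        self.min_ = None          # running per-feature minimum estimate
        self.max_ = None          # running per-feature maximum estimate

    def partial_fit(self, X: np.ndarray) -> "EMAFeatureScaler":
        batch_min = X.min(axis=0)
        batch_max = X.max(axis=0)
        if self.min_ is None:
            # First batch initializes the running statistics directly.
            self.min_, self.max_ = batch_min, batch_max
        else:
            # EMA update: the estimates drift toward the new batch,
            # letting normalization track distribution shift.
            m = self.momentum
            self.min_ = m * self.min_ + (1 - m) * batch_min
            self.max_ = m * self.max_ + (1 - m) * batch_max
        return self

    def transform(self, X: np.ndarray) -> np.ndarray:
        # Guard against degenerate features where max == min.
        scale = np.where(self.max_ > self.min_, self.max_ - self.min_, 1.0)
        return (X - self.min_) / scale
```

In a continual-learning loop, `partial_fit` would be called on each incoming batch before the model step, so normalization adapts alongside the learner rather than assuming access to the full dataset up front.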