CLeAN: Continual Learning Adaptive Normalization in Dynamic Environments

📅 2026-03-18
🤖 AI Summary
This work addresses the challenge posed by temporal dynamics and distribution shifts in continual learning, where conventional normalization methods, which rely on global data statistics, struggle to adapt. To overcome this limitation, we propose CLeAN, an adaptive normalization method tailored to tabular data that, for the first time, integrates learnable normalization parameters updated by an exponential moving average (EMA) mechanism into continual learning. CLeAN dynamically estimates and updates feature scales as data streams arrive, and it integrates seamlessly with mainstream replay- and regularization-based strategies such as Reservoir Replay, A-GEM, and EWC. Experiments on two benchmark datasets demonstrate that CLeAN significantly mitigates catastrophic forgetting while simultaneously improving performance on new tasks, thereby enhancing model stability and adaptability in dynamic environments.

📝 Abstract
Artificial intelligence systems predominantly rely on static data distributions, making them ineffective in dynamic real-world environments, such as cybersecurity, autonomous transportation, or finance, where data shifts frequently. Continual learning offers a potential solution by enabling models to learn from sequential data while retaining prior knowledge. However, a critical and underexplored issue in this domain is data normalization. Conventional normalization methods, such as min-max scaling, presuppose access to the entire dataset, which is incongruent with the sequential nature of continual learning. In this paper, we introduce Continual Learning Adaptive Normalization (CLeAN), a novel adaptive normalization technique designed for continual learning on tabular data. CLeAN estimates global feature scales using learnable parameters that are updated via an Exponential Moving Average (EMA) module, enabling the model to adapt to evolving data distributions. Through comprehensive evaluations on two datasets and various continual learning strategies, including Reservoir Experience Replay, A-GEM, and EWC, we demonstrate that CLeAN not only improves model performance on new data but also mitigates catastrophic forgetting. The findings underscore the importance of adaptive normalization in enhancing the stability and effectiveness of models trained on tabular data, offering a novel perspective on the use of normalization to preserve knowledge in dynamic learning environments.
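The core idea described above — tracking per-feature statistics with an exponential moving average so that normalization can follow a drifting data stream — can be sketched in a few lines. This is a minimal illustration under assumed details (the class name, `momentum` hyperparameter, and mean/variance standardization are my assumptions; the paper's actual parameterization of feature scales may differ):

```python
import numpy as np

class CLeANNormalizer:
    """Minimal sketch of EMA-based adaptive normalization for streaming
    tabular data. Illustrative only; not the authors' implementation."""

    def __init__(self, n_features, momentum=0.1):
        self.momentum = momentum          # EMA update rate (assumed hyperparameter)
        self.mean = np.zeros(n_features)  # running per-feature mean
        self.var = np.ones(n_features)    # running per-feature variance

    def partial_fit(self, batch):
        # Blend the running statistics with the current batch's statistics,
        # so the estimates track distribution shifts in the stream instead
        # of assuming access to the full dataset up front.
        m = self.momentum
        self.mean = (1 - m) * self.mean + m * batch.mean(axis=0)
        self.var = (1 - m) * self.var + m * batch.var(axis=0)

    def transform(self, batch, eps=1e-8):
        # Standardize each feature with the current running estimates.
        return (batch - self.mean) / np.sqrt(self.var + eps)
```

In a continual-learning loop, `partial_fit` would be called on each incoming batch before (or alongside) the model update, so that replay- or regularization-based learners always see inputs on a stable scale even as the raw feature distribution drifts.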
Problem

Research questions and friction points this paper is trying to address.

continual learning
data normalization
dynamic environments
catastrophic forgetting
tabular data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Continual Learning
Adaptive Normalization
Exponential Moving Average
Tabular Data
Catastrophic Forgetting
Isabella Marasco
Department of Computer Science and Engineering, University of Bologna, Bologna, 40126, Italy
Davide Evangelista
University of Bologna
Deep Learning · Medical Imaging · Inverse Problems
Elena Loli Piccolomini
Department of Computer Science and Engineering, University of Bologna, Bologna, 40126, Italy
Michele Colajanni
Department of Computer Science and Engineering, University of Bologna, Bologna, 40126, Italy