🤖 AI Summary
This work proposes MAGIC Net, an online learning framework that integrates recurrent neural networks with continual learning mechanisms to address three key challenges in data streams: concept drift, temporal dependencies, and catastrophic forgetting. MAGIC Net introduces, for the first time in streaming continual learning, a learnable mask combined with dynamic architecture expansion. The learnable mask enables backward masking to freeze selected weights, thereby preserving previously acquired knowledge, while the network dynamically expands its structure to accommodate emerging concepts. Experimental results demonstrate that MAGIC Net significantly enhances adaptation to new concepts on both synthetic and real-world data streams, effectively mitigates catastrophic forgetting, and simultaneously reduces memory overhead.
📝 Abstract
Concept drift, temporal dependence, and catastrophic forgetting represent major challenges when learning from data streams. While Streaming Machine Learning and Continual Learning (CL) address these issues separately, recent efforts in Streaming Continual Learning (SCL) aim to unify them. In this work, we introduce MAGIC Net, a novel SCL approach that integrates CL-inspired architectural strategies with recurrent neural networks to tame temporal dependence. MAGIC Net continuously learns, looks back at past knowledge by applying learnable masks over frozen weights, and expands its architecture when necessary. It performs all operations online, ensuring inference availability at all times. Experiments on synthetic and real-world streams show that it improves adaptation to new concepts, limits memory usage, and mitigates forgetting.
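To make the masking-plus-expansion idea above concrete, here is a minimal, hypothetical sketch of a single layer: frozen weights are read through a learnable gate (preserving old knowledge while letting the mask modulate it), and new columns are appended when a new concept appears. All names (`MaskedExpandableLayer`, `W_frozen`, `mask_logits`, `expand`) are illustrative assumptions, not the paper's actual implementation, and the training loop that would update the mask and the expanded weights is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

class MaskedExpandableLayer:
    """Illustrative sketch (not MAGIC Net's real code): a linear layer whose
    frozen weights are gated by a learnable mask, with dynamic expansion."""

    def __init__(self, n_in, n_out):
        self.W_frozen = rng.standard_normal((n_in, n_out))  # preserved knowledge; never updated
        self.mask_logits = np.zeros((n_in, n_out))          # learnable mask parameters
        self.W_new = np.empty((n_in, 0))                    # columns added by expansion

    def mask(self):
        # Sigmoid gate in [0, 1]; training would push entries toward 0 or 1,
        # effectively selecting which frozen weights to reuse.
        return 1.0 / (1.0 + np.exp(-self.mask_logits))

    def expand(self, n_extra):
        # Grow the architecture when an emerging concept needs capacity.
        extra = 0.01 * rng.standard_normal((self.W_frozen.shape[0], n_extra))
        self.W_new = np.concatenate([self.W_new, extra], axis=1)

    def forward(self, x):
        # Frozen weights contribute only through the mask; gradients (not shown)
        # would flow into mask_logits and W_new, never into W_frozen.
        out_old = x @ (self.W_frozen * self.mask())
        out_new = x @ self.W_new
        return np.concatenate([out_old, out_new], axis=1)

layer = MaskedExpandableLayer(4, 3)
x = rng.standard_normal((2, 4))
y_before = layer.forward(x)   # shape (2, 3): only masked frozen weights
layer.expand(2)               # a new concept arrives -> wider output
y_after = layer.forward(x)    # shape (2, 5): old outputs plus new units
```

Because `W_frozen` is never updated, knowledge from earlier concepts cannot be overwritten; only the mask and the expanded columns adapt online, which is the intuition behind the reduced forgetting and bounded memory growth the abstract reports.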