Don't Look Back in Anger: MAGIC Net for Streaming Continual Learning with Temporal Dependence

📅 2025-12-08
🏛️ BigData Congress [Services Society]
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes MAGIC Net, an online learning framework that integrates recurrent neural networks with continual learning mechanisms to address three key challenges in data streams: concept drift, temporal dependencies, and catastrophic forgetting. MAGIC Net introduces, for the first time in streaming continual learning, a learnable mask combined with dynamic architecture expansion. The learnable mask enables backward masking to freeze selected weights, thereby preserving previously acquired knowledge, while the network dynamically expands its structure to accommodate emerging concepts. Experimental results demonstrate that MAGIC Net significantly enhances adaptation to new concepts on both synthetic and real-world data streams, effectively mitigates catastrophic forgetting, and simultaneously reduces memory overhead.
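The two mechanisms described above can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the mask is fixed here rather than learned, and the shapes and learning rate are arbitrary. It shows (1) backward masking, where frozen weights receive no gradient update, and (2) dynamic expansion, where fresh units are appended for an emerging concept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch, not the paper's code:
# (1) backward masking - freeze selected weights by zeroing their gradient;
# (2) dynamic expansion - grow the layer when a new concept appears.
W = rng.normal(size=(4, 4))                      # weights of one layer
mask = (rng.random((4, 4)) > 0.5).astype(float)  # 1 = trainable, 0 = frozen
frozen_snapshot = W.copy()

# --- (1) one masked SGD step: frozen entries receive no update ---
grad = rng.normal(size=(4, 4))
lr = 0.1
W -= lr * (grad * mask)
assert np.allclose(W[mask == 0], frozen_snapshot[mask == 0])

# --- (2) expansion: append fresh, small-initialized units ---
new_units = rng.normal(size=(4, 2)) * 0.01
W = np.concatenate([W, new_units], axis=1)
print(W.shape)  # (4, 6)
```

The key point is that the frozen entries are bit-identical after the update, which is how previously acquired knowledge survives, while expansion adds capacity without touching existing weights.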

📝 Abstract
Concept drift, temporal dependence, and catastrophic forgetting represent major challenges when learning from data streams. While Streaming Machine Learning and Continual Learning (CL) address these issues separately, recent efforts in Streaming Continual Learning (SCL) aim to unify them. In this work, we introduce MAGIC Net, a novel SCL approach that integrates CL-inspired architectural strategies with recurrent neural networks to tame temporal dependence. MAGIC Net continuously learns, looks back at past knowledge by applying learnable masks over frozen weights, and expands its architecture when necessary. It performs all operations online, ensuring inference availability at all times. Experiments on synthetic and real-world streams show that it improves adaptation to new concepts, limits memory usage, and mitigates forgetting.
Problem

Research questions and friction points this paper is trying to address.

Concept Drift
Temporal Dependence
Catastrophic Forgetting
Streaming Continual Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Streaming Continual Learning
Temporal Dependence
Catastrophic Forgetting
Learnable Masks
Online Learning
Federico Giannini
DEIB - Politecnico di Milano
Sandro D'Andrea
DEIB - Politecnico di Milano
Emanuele Della Valle
Politecnico di Milano
semantic web · stream processing · data streams · concept drift · big data