cPNN: Continuous Progressive Neural Networks for Evolving Streaming Time Series

📅 2026-03-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes continuous Progressive Neural Networks (cPNN) to address the threefold challenges of temporal dependencies, concept drift, and catastrophic forgetting in non-stationary time series data streams. By integrating recurrent neural network architectures with a streaming-oriented stochastic gradient descent strategy, cPNN enables a dynamically expandable network structure that efficiently learns new concepts while preserving historical knowledge. As the first unified framework tackling these interrelated issues simultaneously, cPNN significantly enhances adaptation speed and robustness to distributional shifts without compromising memory of previously learned tasks. Ablation studies further confirm its superior performance in scenarios involving concept drift, demonstrating its effectiveness in maintaining both stability and plasticity in dynamic environments.
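The summary's core mechanism, inherited from Progressive Neural Networks, is to freeze the columns trained on past concepts and grow a fresh trainable column for each new concept, connected laterally to the frozen ones. A minimal sketch of that structure is below; the class name, shapes, and initialization are illustrative assumptions, not the paper's implementation (which uses recurrent columns).

```python
import numpy as np

class ProgressiveColumns:
    """Sketch of the progressive-network idea behind cPNN: each new
    concept gets a fresh column; older columns are frozen and feed the
    new column through lateral connections."""

    def __init__(self, n_in, n_hidden, seed=0):
        self.n_in, self.n_hidden = n_in, n_hidden
        self.rng = np.random.default_rng(seed)
        self.columns = []   # input weights, one matrix per concept
        self.laterals = []  # lateral weights from each older column
        self.add_column()

    def add_column(self):
        """Grow a new trainable column; prior columns stay frozen."""
        W = self.rng.normal(scale=0.1, size=(self.n_in, self.n_hidden))
        # one lateral matrix per previously learned (frozen) column
        U = [self.rng.normal(scale=0.1, size=(self.n_hidden, self.n_hidden))
             for _ in self.columns]
        self.columns.append(W)
        self.laterals.append(U)

    def forward(self, x):
        """Activate every column; newer columns also read the hidden
        features of all older columns via their lateral weights."""
        hs = []
        for W, U in zip(self.columns, self.laterals):
            h = np.tanh(x @ W + sum(hp @ u for hp, u in zip(hs, U)))
            hs.append(h)
        return hs[-1]  # a prediction head would sit on the newest column
```

Freezing old columns is what avoids catastrophic forgetting, while the lateral connections let the new column reuse past features, which is the claimed source of fast adaptation.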

📝 Abstract
Dealing with an unbounded data stream involves overcoming the assumption that data is identically distributed and independent. A data stream can, in fact, exhibit temporal dependencies (i.e., be a time series), and data can change distribution over time (concept drift). Both problems have been studied in depth, but existing solutions address them separately: a joint solution is absent. In addition, learning multiple concepts implies remembering the past (a.k.a. avoiding catastrophic forgetting, in Neural Networks' terminology). This work proposes Continuous Progressive Neural Networks (cPNN), a solution that tames concept drifts, handles temporal dependencies, and bypasses catastrophic forgetting. cPNN is a continuous version of Progressive Neural Networks, a methodology for remembering old concepts and transferring past knowledge to fit new concepts quickly. We base our method on Recurrent Neural Networks and exploit Stochastic Gradient Descent applied to data streams with temporal dependencies. Results of an ablation study show quick adaptation of cPNN to new concepts and robustness to drifts.
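The training regime the abstract describes, SGD applied to an unbounded stream, is typically run prequentially: each incoming example is first used to evaluate the model, then to update it. The sketch below illustrates that loop with a plain logistic-regression step and a sliding window standing in for the paper's recurrent handling of temporal dependence; the function name and parameters are assumptions for illustration.

```python
import numpy as np

def prequential_sgd(stream, w, lr=0.01, window=5):
    """Test-then-train loop over a stream of (value, label) pairs.
    Each sample is first scored (prequential evaluation), then used
    for one SGD step. A sliding window of the last `window` values is
    the feature vector, a simplification of a recurrent model that
    still respects temporal order."""
    buf, correct, seen = [], 0, 0
    for x_t, y_t in stream:
        buf.append(x_t)
        if len(buf) < window:
            continue  # wait until a full window is available
        feats = np.asarray(buf[-window:])
        # ---- test first: interleaved (prequential) evaluation ----
        p = 1.0 / (1.0 + np.exp(-feats @ w))
        correct += int((p >= 0.5) == bool(y_t))
        seen += 1
        # ---- then train: one logistic-regression SGD step ----
        w = w - lr * (p - y_t) * feats
    return w, correct / max(seen, 1)
```

In cPNN, the gradient step would update only the newest column's weights, so drift adaptation and preservation of old concepts coexist.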
Problem

Research questions and friction points this paper is trying to address.

concept drift
temporal dependencies
catastrophic forgetting
streaming time series
neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

concept drift
catastrophic forgetting
temporal dependencies
progressive neural networks
streaming time series
Federico Giannini
DEIB, Politecnico di Milano, Milano, Italy
Giacomo Ziffer
DEIB, Politecnico di Milano, Milano, Italy
Emanuele Della Valle
Politecnico di Milano
semantic web · stream processing · data streams · concept drift · big data