DeepBooTS: Dual-Stream Residual Boosting for Drift-Resilient Time-Series Forecasting

📅 2025-11-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
In time-series forecasting, non-stationarity induced by concept drift severely degrades model robustness. To address this, we propose DeepBooTS, an end-to-end dual-stream residual-decreasing boosting framework that models concept drift from a bias-variance trade-off perspective. DeepBooTS introduces a block-level residual-correction mechanism and a multi-learner ensemble architecture to achieve input-target decoupling and progressive signal reconstruction. Technically, it integrates instance normalization, deep residual networks, and a dual-stream structure, augmented with block-level auxiliary output branches that form a highway for layer-wise residual refinement and joint time-frequency signal separation. On large-scale benchmarks, DeepBooTS achieves an average 15.8% improvement in forecasting accuracy, markedly enhancing adaptability to concept drift and model stability. This work establishes a novel paradigm for non-stationary time-series forecasting.
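The bias-variance claim underpinning the method (a weighted ensemble reduces variance without increasing bias) can be checked numerically. The following is a minimal sketch, not from the paper, assuming `K` independent, unbiased learners with equal noise variance; all variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# K unbiased "learners", each estimating a true value mu with independent unit-variance noise.
mu, K, trials = 3.0, 8, 20000
preds = mu + rng.standard_normal((trials, K))  # column k = predictions of learner k

single = preds[:, 0]              # a single learner
weights = np.full(K, 1.0 / K)     # uniform convex weights (sum to 1)
ensemble = preds @ weights        # weighted ensemble prediction

# Bias is unchanged by a convex combination of unbiased learners;
# variance drops by roughly 1/K when the learners' errors are independent.
print(f"bias:     single={single.mean() - mu:+.3f}  ensemble={ensemble.mean() - mu:+.3f}")
print(f"variance: single={single.var():.3f}  ensemble={ensemble.var():.3f}")
```

With correlated learners the variance reduction is smaller, which is one reason block-wise residual correction (forcing later learners to model what earlier ones missed) can help in practice.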

📝 Abstract
Time-Series (TS) exhibits pronounced non-stationarity. Consequently, most forecasting methods display compromised robustness to concept drift, despite the prevalent application of instance normalization. We tackle this challenge by first analysing concept drift through a bias-variance lens and proving that weighted ensemble reduces variance without increasing bias. These insights motivate DeepBooTS, a novel end-to-end dual-stream residual-decreasing boosting method that progressively reconstructs the intrinsic signal. In our design, each block of a deep model becomes an ensemble of learners with an auxiliary output branch forming a highway to the final prediction. The block-wise outputs correct the residuals of previous blocks, leading to a learning-driven decomposition of both inputs and targets. This method enhances versatility and interpretability while substantially improving robustness to concept drift. Extensive experiments, including those on large-scale datasets, show that the proposed method outperforms existing methods by a large margin, yielding an average performance improvement of 15.8% across various datasets, establishing a new benchmark for TS forecasting.
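The block-wise mechanism the abstract describes — each block extracts a component of the input, contributes a partial prediction via a highway to the final output, and passes the residual to the next block — can be sketched in miniature. This is an illustrative toy, not the paper's implementation: it substitutes fixed frequency-band splits and linear least-squares learners for DeepBooTS's learned deep blocks, and all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: windows of a trend + seasonal series; predict the next H steps from the last L.
n, L, H = 400, 32, 8
t = np.arange(n + L + H)
series = 0.05 * t + np.sin(2 * np.pi * t / 24) + 0.2 * rng.standard_normal(t.size)
X = np.stack([series[i : i + L] for i in range(n)])          # (n, L) input windows
Y = np.stack([series[i + L : i + L + H] for i in range(n)])  # (n, H) targets

def band_mask(L, lo, hi):
    # Boolean mask over rFFT bins [lo, hi) -- one "frequency band" per block.
    m = np.zeros(L // 2 + 1, dtype=bool)
    m[lo:hi] = True
    return m

bands = [band_mask(L, 0, 2), band_mask(L, 2, 6), band_mask(L, 6, L // 2 + 1)]

blocks = []                       # (band mask, linear learner) per block
x_res, y_res = X.copy(), Y.copy()
for mask in bands:
    # Input stream: extract this block's component, remove it from the input residual.
    spec = np.fft.rfft(x_res, axis=1)
    comp = np.fft.irfft(np.where(mask, spec, 0), n=L, axis=1)
    x_res = x_res - comp
    # Output stream: fit a learner from the component to the remaining target residual.
    W, *_ = np.linalg.lstsq(comp, y_res, rcond=None)
    y_res = y_res - comp @ W      # the target residual shrinks block by block
    blocks.append((mask, W))

def predict(x):
    # Highway: the final forecast is the sum of all block-wise outputs.
    out, res = np.zeros(H), x.copy()
    for mask, W in blocks:
        spec = np.fft.rfft(res)
        comp = np.fft.irfft(np.where(mask, spec, 0), n=L)
        res = res - comp
        out = out + comp @ W
    return out
```

Because the bands partition the spectrum, the input is fully decomposed across blocks, and each least-squares fit can only reduce the training residual — a linear analogue of the "learning-driven decomposition of both inputs and targets" described above.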
Problem

Research questions and friction points this paper is trying to address.

Addressing concept drift in non-stationary time-series forecasting
Reducing prediction residuals through dual-stream boosting architecture
Enhancing robustness and interpretability of time-series models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dual-stream residual boosting for time-series
Ensemble learners with auxiliary output branches
Progressive intrinsic signal reconstruction via residuals
Daojun Liang
Qilu University of Technology
Machine Learning · Computer Vision · Natural Language Processing
Jing Chen
Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan, 250103, China.
Xiao Wang
School of Intelligent Manufacturing and Control Engineering, Qilu Institute of Technology, Jinan, 250200, China.
Yinglong Wang
Key Laboratory of Computing Power Network and Information Security, Ministry of Education, Shandong Computer Science Center (National Supercomputer Center in Jinan), Qilu University of Technology (Shandong Academy of Sciences), Jinan, 250103, China.
Suo Li
Department of Computer and Data Sciences, Case Western Reserve University, Cleveland, OH 44106, USA.