Lightweight Channel-wise Dynamic Fusion Model: Non-stationary Time Series Forecasting via Entropy Analysis

📅 2025-03-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address global temporal dependency breakdown, channel-wise dynamic information loss, and over-smoothing in non-stationary time series forecasting, this paper proposes the lightweight Channel Dynamic Fusion Model (CDFM). Methodologically, CDFM introduces variance as an interpretable metric of non-stationarity—its first use in this context—and employs a dual-branch predictor to jointly model stationary and non-stationary components. A novel channel selector adaptively recovers critical dynamics by leveraging non-stationarity intensity, inter-channel similarity, and distribution consistency. Furthermore, variance-driven dynamic fusion weights balance normalized predictability with preservation of original dynamic characteristics. Evaluated on seven benchmark datasets, CDFM consistently outperforms state-of-the-art methods, achieving superior trade-offs among prediction accuracy, generalization capability, and computational efficiency. It effectively mitigates over-smoothing, temporal dependency fragmentation, and channel-agnostic modeling.
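As a rough illustrative sketch (not the authors' implementation), the variance-driven dynamic fusion described above could blend a stationary-branch and a non-stationary-branch prediction with a per-sample weight derived from input variance. The sigmoid-of-variance weighting below is an assumption for illustration, not the paper's exact Fusion Weight Learner:

```python
import numpy as np

def dynamic_fusion(x, pred_stationary, pred_nonstationary):
    """Blend two branch predictions with a variance-driven weight.

    x: (batch, length, channels) input window.
    pred_*: (batch, horizon, channels) branch outputs.
    The sigmoid-of-variance weight is an illustrative choice, not the
    paper's learned fusion weight.
    """
    var = x.var(axis=1, keepdims=True)      # per-sample, per-channel variance
    w = 1.0 / (1.0 + np.exp(-var))          # sigmoid maps variance into (0.5, 1)
    w = (w - 0.5) * 2.0                     # rescale to (0, 1): higher variance -> more non-stationary branch
    return w * pred_nonstationary + (1.0 - w) * pred_stationary
```

With a constant (zero-variance) input the weight collapses to 0 and the output falls back entirely to the stationary branch, matching the intuition that low non-stationarity needs no recovery.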

📝 Abstract
Non-stationarity is an intrinsic property of real-world time series and plays a crucial role in time series forecasting. Previous studies primarily adopt instance normalization to attenuate the non-stationarity of original series for better predictability. However, instance normalization that directly removes the inherent non-stationarity can lead to three issues: (1) disrupting global temporal dependencies, (2) ignoring channel-specific differences, and (3) producing over-smoothed predictions. To address these issues, we theoretically demonstrate that variance can be a valid and interpretable proxy for quantifying non-stationarity of time series. Based on this analysis, we propose a novel lightweight Channel-wise Dynamic Fusion Model (CDFM), which selectively and dynamically recovers the intrinsic non-stationarity of the original series while keeping the predictability of the normalized series. First, we design a Dual-Predictor Module with two branches: a Time Stationary Predictor for capturing stable patterns and a Time Non-stationary Predictor for modeling global dynamic patterns. Second, we propose a Fusion Weight Learner to dynamically characterize the intrinsic non-stationary information across different samples based on variance. Finally, we introduce a Channel Selector to selectively recover non-stationary information from specific channels by evaluating their non-stationarity, similarity, and distribution consistency, enabling the model to capture relevant dynamic features and avoid overfitting. Comprehensive experiments on seven time series datasets demonstrate the superiority and generalization capabilities of CDFM.
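For context, the instance normalization criticized in the abstract typically standardizes each input window by its own statistics. A minimal sketch (a generic per-window standardization under assumed shapes, not the paper's code) makes the trade-off concrete: normalization stabilizes the input, but the mean and scale it strips out are exactly the non-stationary information CDFM argues should be selectively recovered.

```python
import numpy as np

def instance_normalize(x, eps=1e-5):
    """Standardize a window x of shape (length, channels) by its own statistics.

    Removing the window's level and scale improves predictability, but it
    also discards the non-stationary information (level shifts, variance
    changes) that the model later needs to restore.
    """
    mean = x.mean(axis=0, keepdims=True)
    std = x.std(axis=0, keepdims=True) + eps
    return (x - mean) / std, (mean, std)

def instance_denormalize(y, stats):
    """Restore the statistics removed by instance_normalize."""
    mean, std = stats
    return y * std + mean
```

The normalize/denormalize pair is lossless for the input window itself; the issues listed in the abstract arise because predictions are made in the normalized space, where channel-specific and global dynamics have been flattened away.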
Problem

Research questions and friction points this paper is trying to address.

Addresses non-stationarity in time series forecasting.
Proposes dynamic fusion model to recover intrinsic non-stationarity.
Improves prediction by capturing global and channel-specific dynamics.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dual-Predictor Module captures stable and dynamic patterns.
Fusion Weight Learner dynamically characterizes non-stationarity.
Channel Selector recovers non-stationary information selectively.
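The Channel Selector's scoring can be sketched in miniature. The paper combines non-stationarity intensity, inter-channel similarity, and distribution consistency; the hypothetical sketch below scores by variance (the paper's non-stationarity proxy) alone and keeps the top-k channels:

```python
import numpy as np

def select_channels(x, k):
    """Rank channels by non-stationarity intensity and keep the top-k.

    x: (length, channels) window. Scores each channel by its variance over
    time, the paper's proxy for non-stationarity. The full Channel Selector
    also weighs inter-channel similarity and distribution consistency,
    which this sketch omits.
    """
    scores = x.var(axis=0)                 # per-channel variance over time
    return np.argsort(scores)[::-1][:k]    # indices of the k most non-stationary channels
```

Restricting recovery to the selected channels is what lets the model re-inject dynamics where they matter while avoiding the overfitting that blanket recovery across all channels would invite.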