🤖 AI Summary
This work addresses the challenges posed by temporal input, spatial input, and conditional output distribution shifts in time series forecasting, which commonly undermine the effectiveness of normalization methods. For the first time, we systematically analyze the mechanism of Reversible Instance Normalization (RevIN) and, through ablation studies, reveal that certain of its components are redundant or even detrimental to performance. Building on these insights, we propose a refined perspective that clearly distinguishes the essential from the non-essential elements of RevIN, leading to substantially improved model robustness and generalization. Our findings provide a principled foundation for designing more efficient and effective normalization strategies tailored specifically to time series data.
📝 Abstract
Data normalization is a crucial component of deep learning models, yet its role in time series forecasting remains insufficiently understood. In this paper, we identify three central challenges for normalization in time series forecasting: temporal input distribution shift, spatial input distribution shift, and conditional output distribution shift. In this context, we revisit the widely used Reversible Instance Normalization (RevIN), showing through ablation studies that several of its components are redundant or even detrimental. Based on these observations, we propose new perspectives for improving RevIN's robustness and generalization.
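For readers unfamiliar with the method under study, the core of RevIN is a per-instance normalize/denormalize pair: each input window is standardized by its own statistics over the time axis, and those statistics are reapplied to the forecast. The following NumPy sketch illustrates only this core mechanism; it omits RevIN's optional learnable affine parameters (one of the components whose usefulness the paper's ablations question), and the function names and shapes are illustrative, not the paper's code.

```python
import numpy as np

def revin_normalize(x, eps=1e-5):
    """RevIN forward step: standardize each instance by its own
    mean and std over the time axis.
    x: array of shape (batch, time, features)."""
    mean = x.mean(axis=1, keepdims=True)
    std = x.std(axis=1, keepdims=True)
    x_norm = (x - mean) / (std + eps)
    return x_norm, (mean, std)

def revin_denormalize(y_norm, stats, eps=1e-5):
    """RevIN reverse step: map the model's output back to the
    original scale using the stored instance statistics."""
    mean, std = stats
    return y_norm * (std + eps) + mean

# Toy check: the pair is reversible, so denormalizing the
# normalized input recovers the original data.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=(2, 24, 3))
x_norm, stats = revin_normalize(x)
x_back = revin_denormalize(x_norm, stats)
assert np.allclose(x, x_back)
```

In practice the forecasting model sits between the two calls, so the reverse step applies the input window's statistics to the predicted horizon, which is what makes the scheme sensitive to conditional output distribution shift.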