📝 Abstract
Time series forecasting plays a critical role in domains such as energy, finance, and healthcare, where accurate predictions inform decision-making under uncertainty. Although Transformer-based models have demonstrated success in sequential modeling, their adoption for time series remains limited by noise sensitivity, difficulty capturing long-range dependencies, and a lack of inductive bias for temporal structure. In this work, we present a unified and principled framework for benchmarking three prominent Transformer forecasting architectures (Autoformer, Informer, and PatchTST), each evaluated through three architectural variants: Minimal, Standard, and Full, representing increasing levels of complexity and modeling capacity. We conduct over 1,500 controlled experiments on a suite of ten synthetic signals, spanning five patch lengths and five forecast horizons under both clean and noisy conditions. Our analysis reveals consistent performance patterns across model families. To advance this landscape further, we introduce Deep Koopformer, a Koopman-enhanced Transformer framework that integrates operator-theoretic latent-state modeling to improve stability and interpretability, and we demonstrate its efficacy on nonlinear and chaotic dynamical systems. Our results highlight the Koopman-based Transformer as a promising hybrid approach for robust, interpretable, and theoretically grounded time series forecasting in noisy and complex real-world conditions.
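The operator-theoretic idea behind the Koopman enhancement, advancing a learned latent state with a linear operator whose spectrum is constrained so that long rollouts stay stable, can be sketched as follows. This is an illustrative toy, not the paper's implementation: the names `contractive_operator` and `rollout`, and the singular-value rescaling used as the stability device, are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def contractive_operator(dim, rng):
    # Illustrative stability trick: rescale a random matrix so its
    # largest singular value is <= 1. This guarantees the latent state
    # norm never grows under repeated application, so rollouts cannot
    # diverge. Deep Koopformer's actual parameterization may differ.
    K = rng.standard_normal((dim, dim))
    return K / max(np.linalg.norm(K, 2), 1.0)

def rollout(K, z0, steps):
    # Linear latent dynamics: z_{t+1} = K @ z_t. A full model would
    # wrap this between a learned encoder and decoder (e.g., the
    # Transformer backbone mapping observations to latent states).
    traj = [z0]
    for _ in range(steps):
        traj.append(K @ traj[-1])
    return np.stack(traj)

dim, steps = 8, 50
K = contractive_operator(dim, rng)
z0 = rng.standard_normal(dim)
traj = rollout(K, z0, steps)

norms = np.linalg.norm(traj, axis=1)
assert traj.shape == (steps + 1, dim)
assert np.all(norms[1:] <= norms[:-1] + 1e-9)  # non-expanding rollout
```

Because the operator is linear, its eigenvalues directly describe the decay and oscillation of each latent mode, which is the source of the interpretability claimed for Koopman-based models.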