🤖 AI Summary
To address key bottlenecks in long-horizon time series modeling—including length constraints, strong distributional assumptions, and limited generalization—this paper introduces Sundial, a family of time series foundation models. Methodologically, Sundial features a native architecture that supports arbitrary-length inputs and multi-modal probabilistic forecasting (predictive distributions with multiple modes), eliminating discrete tokenization and parametric density assumptions. It introduces TimeFlow Loss, a flow-matching objective for time series that mitigates mode collapse in generative modeling. Furthermore, Sundial employs a Transformer with minimal but crucial adaptations and is pre-trained end-to-end on TimeBench, a large-scale corpus of one trillion time points comprising mostly real-world datasets plus synthetic data. Empirically, Sundial achieves state-of-the-art performance on both point and probabilistic forecasting benchmarks, demonstrating strong zero-shot cross-domain generalization and favorable scaling behavior with model size.
📝 Abstract
We introduce Sundial, a family of native, flexible, and scalable time series foundation models. To predict the next patch's distribution, we propose a TimeFlow Loss based on flow-matching, which facilitates native pre-training of Transformers on time series without discrete tokenization. Conditioned on arbitrary-length time series, our model is pre-trained without specifying any prior distribution and can generate multiple probable predictions, achieving flexibility in representation learning beyond parametric densities. Towards time series foundation models, we leverage minimal but crucial adaptations of Transformers and curate TimeBench with 1 trillion time points, comprising mostly real-world datasets and synthetic data. By mitigating mode collapse through TimeFlow Loss, we pre-train a family of Sundial models on TimeBench, which exhibit unprecedented model capacity and generalization performance on zero-shot forecasting. In addition to presenting good scaling behavior, Sundial achieves a new state of the art on both point forecasting and probabilistic forecasting benchmarks. We believe that Sundial's pioneering generative paradigm will facilitate a wide variety of forecasting scenarios.
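To make the flow-matching idea behind a TimeFlow-style loss concrete, here is a minimal sketch of a conditional flow-matching objective for one next-patch target. This is an illustrative reconstruction of generic flow matching (linear interpolant between Gaussian noise and the data patch, regressing the straight-path velocity), not Sundial's exact formulation; `pred_velocity_fn`, `cond`, and all other names are hypothetical.

```python
import numpy as np

def timeflow_loss_sketch(pred_velocity_fn, cond, next_patch, rng):
    """Hypothetical flow-matching loss for a single next-patch target.

    pred_velocity_fn(x_t, t, cond) -> predicted velocity, same shape as the
    patch. The network conditions on `cond` (e.g. representations of the
    observed series). All names here are illustrative assumptions.
    """
    x1 = next_patch                       # data sample: ground-truth next patch
    x0 = rng.standard_normal(x1.shape)    # noise sample from N(0, I)
    t = rng.uniform()                     # interpolation time t ~ U(0, 1)
    x_t = (1.0 - t) * x0 + t * x1         # linear interpolant noise -> data
    v_target = x1 - x0                    # velocity of the straight path
    v_pred = pred_velocity_fn(x_t, t, cond)
    return float(np.mean((v_pred - v_target) ** 2))
```

Because the target is a velocity field rather than the parameters of a fixed density, sampling multiple noise draws at inference time yields multiple probable predictions, which is how such a loss avoids committing to a single parametric distribution or collapsing onto one mode.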