🤖 AI Summary
To address the need for lightweight, general-purpose time-series foundation models that support both deterministic and probabilistic forecasting, this paper proposes FLAME, a highly efficient architecture built on an enhanced Legendre memory mechanism. Methodologically, FLAME introduces two distinct Legendre memory modules, translation-invariant LegT in the encoder and scale-aware LegS in the decoder, enabling explicit long-range dependency modeling and embedding data-driven inductive biases; its probabilistic prediction head employs normalizing flows for high-fidelity, generative distribution estimation. Empirically, FLAME achieves state-of-the-art zero-shot performance on TSFM-Bench and ProbTS while remaining highly efficient (fewer than 10M parameters), with strong cross-dataset generalization and robust long-horizon forecasting. To the authors' knowledge, FLAME is the first general-purpose time-series foundation model to jointly leverage Legendre polynomial variants and normalizing flows, establishing a parameter-efficient paradigm for expressive temporal representation learning.
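The summary names scaled Legendre (LegS) memory as the decoder's mechanism. FLAME's own modules are not shown here; as a rough illustration only, the standard HiPPO-LegS construction from the literature (from which scale-aware Legendre memory derives) can be sketched as follows. All function names are hypothetical, and the forward-Euler step is a simplification of the discretizations used in practice:

```python
import numpy as np

def hippo_legs_matrices(N):
    """Standard HiPPO-LegS state matrices (illustrative, not FLAME's code).

    A[n, k] = sqrt(2n+1) * sqrt(2k+1)  for n > k
              n + 1                    for n == k
              0                        for n < k
    B[n]    = sqrt(2n+1)
    The memory state c(t) evolves as  dc/dt = -(1/t) A c(t) + (1/t) B f(t),
    keeping an online projection of the whole history onto Legendre polynomials.
    """
    n = np.arange(N)
    A = np.sqrt(2 * n[:, None] + 1) * np.sqrt(2 * n[None, :] + 1)
    A = np.tril(A, -1) + np.diag(n + 1)
    B = np.sqrt(2 * n + 1)
    return A, B

def legs_step(c, f_t, t, A, B):
    """One forward-Euler step of the scaled-Legendre memory update at t >= 1."""
    return c + (1.0 / t) * (-A @ c + B * f_t)

# Compress a signal into N Legendre coefficients, one step at a time.
A, B = hippo_legs_matrices(8)
c = np.zeros(8)
signal = np.sin(np.linspace(0.0, 3.0, 100))
for t, f_t in enumerate(signal, start=1):
    c = legs_step(c, f_t, t, A, B)
```

The 1/t scaling is what makes LegS "scale-aware": the memory window stretches with elapsed time, whereas LegT (the encoder-side variant) maintains a fixed-length sliding window.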
📝 Abstract
In this work, we introduce FLAME, a family of extremely lightweight yet capable Time Series Foundation Models that support both deterministic and probabilistic forecasting via generative probabilistic modeling, ensuring both efficiency and robustness. FLAME builds on the Legendre Memory for strong generalization capabilities. By adapting two variants of the Legendre Memory, translated Legendre (LegT) and scaled Legendre (LegS), in the encoding and decoding phases respectively, FLAME effectively captures the inductive bias inherent in the data and performs efficient long-range inference. To improve the accuracy of probabilistic forecasting while remaining efficient, FLAME adopts a Normalizing Flow based forecasting head, which models arbitrarily intricate distributions over the forecasting horizon in a generative manner. Comprehensive experiments on well-recognized benchmarks, including TSFM-Bench and ProbTS, demonstrate FLAME's consistent state-of-the-art zero-shot performance on both deterministic and probabilistic forecasting tasks.
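The abstract does not detail the flow-based forecasting head. As a generic illustration of the building block normalizing flows rely on, here is a minimal RealNVP-style affine coupling layer: an invertible map with a closed-form log-determinant. This is a toy sketch (a linear conditioner stands in for the neural network, and all names are hypothetical), not FLAME's actual head:

```python
import numpy as np

class AffineCoupling:
    """One affine coupling layer (illustrative sketch, not FLAME's head).

    Splits x into (x1, x2) and transforms x2 conditioned on x1:
        y1 = x1
        y2 = x2 * exp(s(x1)) + t(x1)
    Invertible in closed form, with log|det J| = sum(s(x1)).
    """

    def __init__(self, dim, rng):
        self.h = dim // 2
        # Toy linear "networks" for scale s(.) and shift t(.);
        # a real flow would use small MLPs here.
        self.Ws = 0.1 * rng.standard_normal((dim - self.h, self.h))
        self.Wt = 0.1 * rng.standard_normal((dim - self.h, self.h))

    def forward(self, x):
        x1, x2 = x[:self.h], x[self.h:]
        s, t = self.Ws @ x1, self.Wt @ x1
        y2 = x2 * np.exp(s) + t
        return np.concatenate([x1, y2]), np.sum(s)  # (y, log|det J|)

    def inverse(self, y):
        y1, y2 = y[:self.h], y[self.h:]
        s, t = self.Ws @ y1, self.Wt @ y1
        return np.concatenate([y1, (y2 - t) * np.exp(-s)])

rng = np.random.default_rng(0)
layer = AffineCoupling(4, rng)
x = rng.standard_normal(4)
y, log_det = layer.forward(x)
x_rec = layer.inverse(y)  # exact round-trip recovery
```

Stacking such layers (permuting which half is conditioned on) yields a flexible invertible map; "generative" forecasting then amounts to sampling a Gaussian latent and applying the inverse transforms, while the log-determinant terms make exact likelihood training possible.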