🤖 AI Summary
Traditional time-series forecasting treats observations as discrete sequences, overlooking their intrinsic nature as noisy samples from a family of continuous functions governed by a shared probability measure. This work introduces a functional-family perspective, modeling forecasting as a continuous transformation from a historical function family to a future one. To this end, we parameterize the velocity field of a flow in an infinite-dimensional function space—the first such parameterization—by integrating neural operators with flow matching, thereby directly modeling functional-level dynamics. We further incorporate transfer learning between probability measures to infer latent continuous evolution paths from discrete observations. Experiments across diverse forecasting tasks demonstrate significant improvements in accuracy and robustness, validating the effectiveness and generalizability of the functional-family paradigm for capturing the inherent continuity of temporal processes.
📝 Abstract
Time series forecasting is a fundamental task with broad applications, yet conventional methods often treat data as discrete sequences, overlooking their origin as noisy samples of continuous processes. Crucially, discrete noisy observations cannot uniquely determine a continuous function; instead, they correspond to a family of plausible functions. Mathematically, a time series can be viewed as noisy observations of a continuous function family governed by a shared probability measure. Thus, the forecasting task can be framed as learning the transition from the historical function family to the future function family. This reframing introduces two key challenges: (1) How can we leverage discrete historical and future observations to learn the relationships between their underlying continuous functions? (2) How can we model the transition path in function space from the historical function family to the future function family? To address these challenges, we propose NeuTSFlow, a novel framework that leverages Neural Operators to facilitate flow matching, learning a path of measures between the historical and future function families. By parameterizing the velocity field of the flow in infinite-dimensional function spaces, NeuTSFlow moves beyond traditional methods that focus on dependencies at discrete points, directly modeling function-level features instead. Experiments on diverse forecasting tasks demonstrate NeuTSFlow's superior accuracy and robustness, validating the effectiveness of the function-family perspective.
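To make the flow-matching idea above concrete, here is a minimal toy sketch on discretized functions. The synthetic sine "families", the straight-line interpolation path, and all variable names are illustrative assumptions, not NeuTSFlow's actual construction (which parameterizes the velocity field with a neural operator); the sketch only shows the regression target a learned velocity field would be trained against, and that integrating the exact field transports the historical family onto the future one.

```python
import numpy as np

rng = np.random.default_rng(0)
n_grid, batch = 64, 8
grid = np.linspace(0.0, 1.0, n_grid)

# Toy "function families": noisy sines for history, phase-shifted noisy
# sines for the future. These are stand-ins for functions drawn from the
# historical and future probability measures.
phase = rng.uniform(0.0, 2 * np.pi, size=(batch, 1))
hist = np.sin(2 * np.pi * grid + phase) + 0.05 * rng.standard_normal((batch, n_grid))
futr = np.sin(2 * np.pi * grid + phase + 0.5) + 0.05 * rng.standard_normal((batch, n_grid))

# Flow matching on a straight interpolation path:
# x_s = (1 - s) * hist + s * futr, whose velocity d x_s / ds = futr - hist.
def interpolate(s):
    return (1.0 - s) * hist + s * futr

target_velocity = futr - hist  # regression target for a learned field

# A model v_theta(x_s, s) would minimize the flow-matching loss
# E_s || v_theta(x_s, s) - target_velocity ||^2. Here we just evaluate it
# for a perfect field (loss 0) and a zero field (the baseline gap).
s = rng.uniform(size=(batch, 1))
x_s = interpolate(s)
perfect_loss = np.mean((target_velocity - target_velocity) ** 2)
zero_field_loss = np.mean(target_velocity ** 2)

# Euler-integrating the exact (constant) velocity transports history
# onto the future samples, up to floating-point error.
x = hist.copy()
n_steps = 50
for _ in range(n_steps):
    x = x + (1.0 / n_steps) * target_velocity

print(perfect_loss, zero_field_loss, float(np.abs(x - futr).max()))
```

In the paper's setting the straight path is replaced by a learned path of measures, and `v_theta` acts on entire functions via a neural operator rather than on fixed-grid vectors, which is what makes the flow resolution-independent.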