NeuTSFlow: Modeling Continuous Functions Behind Time Series Forecasting

📅 2025-07-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional time-series forecasting treats observations as discrete sequences, overlooking their intrinsic nature as noisy samples from a family of continuous functions governed by a shared probability measure. This work introduces a function-family perspective, framing forecasting as a continuous transformation from a historical function family to a future one. To this end, the velocity field of a flow in an infinite-dimensional function space is parameterized (the first such parameterization) by integrating neural operators with flow matching, thereby modeling dynamics directly at the functional level. Transfer learning between probability measures is further incorporated to infer latent continuous evolution paths from discrete observations. Experiments across diverse forecasting tasks demonstrate significant improvements in accuracy and robustness, validating the effectiveness and generalizability of the function-family paradigm for capturing the inherent continuity of temporal processes.

📝 Abstract
Time series forecasting is a fundamental task with broad applications, yet conventional methods often treat data as discrete sequences, overlooking their origin as noisy samples of continuous processes. Crucially, discrete noisy observations cannot uniquely determine a continuous function; instead, they correspond to a family of plausible functions. Mathematically, time series can be viewed as noisy observations of a continuous function family governed by a shared probability measure. Thus, the forecasting task can be framed as learning the transition from the historical function family to the future function family. This reframing introduces two key challenges: (1) How can we leverage discrete historical and future observations to learn the relationships between their underlying continuous functions? (2) How can we model the transition path in function space from the historical function family to the future function family? To address these challenges, we propose NeuTSFlow, a novel framework that leverages Neural Operators to facilitate flow matching for learning the path of measures between historical and future function families. By parameterizing the velocity field of the flow in infinite-dimensional function spaces, NeuTSFlow moves beyond traditional methods that focus on dependencies at discrete points, directly modeling function-level features instead. Experiments on diverse forecasting tasks demonstrate NeuTSFlow's superior accuracy and robustness, validating the effectiveness of the function-family perspective.
Problem

Research questions and friction points this paper is trying to address.

Modeling continuous functions behind discrete noisy time series data
Learning transitions between historical and future function families
Addressing function-space path modeling for time series forecasting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Neural Operators for flow matching
Models velocity field in function spaces
Learns transition between function families
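The core idea can be illustrated with a toy sketch. This is not the paper's implementation: the function families below are hypothetical random-phase sinusoids, the path is the standard linear flow-matching interpolant (whose conditional velocity is simply x1 - x0), and the "neural operator" is caricatured by a single Fourier-domain linear regression fitted by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 256, 32  # number of sampled functions per family, grid points per function

# Toy stand-ins for the historical and future function families:
# random-phase sinusoids observed with noise on a shared grid.
grid = np.linspace(0.0, 1.0, m)
phase = rng.uniform(0, 2 * np.pi, size=(n, 1))
x0 = np.sin(2 * np.pi * grid + phase) + 0.05 * rng.normal(size=(n, m))  # historical
x1 = np.sin(4 * np.pi * grid + phase) + 0.05 * rng.normal(size=(n, m))  # future

# Flow matching along the linear path x_t = (1 - t) x0 + t x1;
# the conditional velocity target along this path is x1 - x0.
t = rng.uniform(size=(n, 1))
x_t = (1 - t) * x0 + t * x1
v_target = x1 - x0

# Velocity field parameterized in the Fourier domain, the simplest caricature
# of a neural-operator spectral layer: regress the velocity on spectral
# features of the current state plus the time conditioning.
spec = np.fft.rfft(x_t, axis=1)
feats = np.hstack([spec.real, spec.imag, t, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(feats, v_target, rcond=None)
v_pred = feats @ W

# Sanity check: the fitted field should beat the trivial zero-velocity field.
mse_fit = np.mean((v_pred - v_target) ** 2)
mse_zero = np.mean(v_target ** 2)
```

At inference, one would integrate the learned velocity field from a historical sample toward the future family; here the least-squares fit merely shows that a spectral parameterization can regress the transport direction.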
Authors

Huibo Xu, University of Science and Technology of China
Likang Wu, Tianjin University
Xianquan Wang, University of Science and Technology of China (Data Mining, User Modeling, Cognitive Science)
Haoning Dang, Xi'an Jiaotong University
Chun-Wun Cheng, PhD student, University of Cambridge (Implicit Deep Learning, Applied Mathematics, Generative AI)
Angelica I Aviles-Rivero, Cambridge University
Qi Liu, University of Science and Technology of China