AI Foundation Model for Time Series with Innovations Representation

📅 2025-10-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing large language models lack causal structure and physical consistency, limiting their applicability to physics-governed time series in real-time engineering monitoring and control. Method: We propose TS-GPT, the first generative pre-trained Transformer framework built on the Wiener–Kallianpur–Rosenblatt innovations representation theory. TS-GPT maps a time series into a causal, orthogonal innovations process, enabling probabilistic generation of future sequences under strict causality constraints via conditional probability modeling and generative pre-training. Contribution/Results: The framework offers interpretability, physical consistency, and strong generalization. Evaluated on real-time locational marginal price forecasting with historical data from a U.S. independent system operator, TS-GPT significantly outperforms state-of-the-art time series models, demonstrating the effectiveness of theory-driven foundation models for engineering AI applications.

📝 Abstract
This paper introduces an Artificial Intelligence (AI) foundation model for time series in engineering applications, where causal operations are required for real-time monitoring and control. Since engineering time series are governed by physical, rather than linguistic, laws, large-language-model-based AI foundation models may be ineffective or inefficient. Building on the classical innovations representation theory of Wiener, Kallianpur, and Rosenblatt, we propose Time Series GPT (TS-GPT) -- an innovations-representation-based Generative Pre-trained Transformer for engineering monitoring and control. As an example of foundation model adaptation, we consider Probabilistic Generative Forecasting, which produces future time series samples from conditional probability distributions given past realizations. We demonstrate the effectiveness of TS-GPT in forecasting real-time locational marginal prices using historical data from U.S. independent system operators.
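The innovations idea the abstract builds on can be illustrated with a minimal sketch. For a linear Gaussian AR(1) model (a hypothetical toy example, not the paper's model), the innovation at each step is the one-step prediction error given the past, and the resulting sequence is white, i.e., successive innovations are uncorrelated (orthogonal):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy AR(1) process x_t = a * x_{t-1} + w_t (illustrative assumption,
# not the paper's engineering time series).
a, n = 0.8, 5000
w = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t - 1] + w[t]

# Innovations representation: nu_t = x_t - E[x_t | past] = x_t - a * x_{t-1}.
# For this linear Gaussian model the innovations coincide with the driving
# white noise w_t, so they should be serially uncorrelated.
nu = x[1:] - a * x[:-1]
lag1_corr = np.corrcoef(nu[:-1], nu[1:])[0, 1]
print(lag1_corr)
```

In TS-GPT the causal map from observations to innovations is learned rather than given by a known linear model, but the whitening property sketched here is the structural constraint the paper exploits.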
Problem

Research questions and friction points this paper is trying to address.

Developing an AI foundation model for engineering time series monitoring
Addressing the limitations of language-based models in physical systems
Creating an innovations-based Transformer for probabilistic forecasting applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

TS-GPT uses the innovations representation of time series
It adapts a foundation model for engineering monitoring and control
Generative forecasting draws future samples from conditional distributions given past realizations
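Probabilistic generative forecasting, as described above, emits sample paths from the conditional distribution of the future given the past rather than a single point forecast. A minimal Monte Carlo sketch, again assuming a toy AR(1) model in place of the paper's learned Transformer:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical AR(1) dynamics x_t = a * x_{t-1} + w_t. Given the last
# observation, draw many future trajectories from p(x_{t+1:t+H} | x_t);
# the ensemble of paths approximates the conditional forecast distribution.
a, horizon, n_samples = 0.8, 24, 1000
x_last = 1.5  # assumed last observed value

paths = np.zeros((n_samples, horizon))
state = np.full(n_samples, x_last)
for h in range(horizon):
    state = a * state + rng.standard_normal(n_samples)
    paths[:, h] = state

# Empirical quantiles summarize the forecast distribution at each horizon.
q10, q50, q90 = np.quantile(paths, [0.1, 0.5, 0.9], axis=0)
print(q50.shape)  # (24,)
```

The same interface applies to TS-GPT-style forecasters: conditioning on past realizations, sampling future sequences, and reading off quantiles or other risk measures from the sampled paths.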