Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems

📅 2025-03-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the low computational efficiency, poor scalability, and overly restrictive stationarity assumptions of Gaussian Process State-Space Models (GPSSMs) in modeling high-dimensional nonstationary dynamical systems, this paper proposes the Efficient Transformation GPSSM (ETGPSSM). Methodologically, it introduces— for the first time—the coupling of normalizing flows with shared Gaussian processes to construct a flexible, nonstationary latent dynamics prior; develops a generative-process-based variational inference framework; and integrates the Ensemble Kalman Filter (EnKF) to circumvent the absence of closed-form solutions in latent-transformed GPs. Furthermore, Bayesian neural networks are incorporated to enhance representational capacity. Experiments on synthetic and real-world benchmarks demonstrate that ETGPSSM significantly outperforms existing GPSSMs and neural baselines in both high-dimensional state estimation and time-series forecasting, achieving superior accuracy while maintaining high computational efficiency.

Technology Category

Application Category

📝 Abstract
Gaussian process state-space models (GPSSMs) have emerged as a powerful framework for modeling dynamical systems, offering interpretable uncertainty quantification and inherent regularization. However, existing GPSSMs face significant challenges in handling high-dimensional, non-stationary systems due to computational inefficiencies, limited scalability, and restrictive stationarity assumptions. In this paper, we propose an efficient transformed Gaussian process state-space model (ETGPSSM) to address these limitations. Our approach leverages a single shared Gaussian process (GP) combined with normalizing flows and Bayesian neural networks, enabling efficient modeling of complex, high-dimensional state transitions while preserving scalability. To address the lack of closed-form expressions for the implicit process in the transformed GP, we follow its generative process and introduce an efficient variational inference algorithm, aided by the ensemble Kalman filter (EnKF), to enable computationally tractable learning and inference. Extensive empirical evaluations on synthetic and real-world datasets demonstrate the superior performance of our ETGPSSM in system dynamics learning, high-dimensional state estimation, and time-series forecasting, outperforming existing GPSSMs and neural network-based methods in both accuracy and computational efficiency.
Problem

Research questions and friction points this paper is trying to address.

Modeling high-dimensional non-stationary dynamical systems efficiently
Overcoming computational inefficiencies in Gaussian process state-space models
Enabling scalable inference for complex state transitions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines shared GP with normalizing flows
Uses efficient variational inference algorithm
Integrates ensemble Kalman filter for learning
🔎 Similar Papers
No similar papers found.
Zhidi Lin
Zhidi Lin
The University of Hong Kong
Approximate Bayesian InferenceDynamical SystemsGaussian ProcessBayesian Signal Processing
Y
Ying Li
Department of Statistics and Actuarial Science, University of Hong Kong, Hong Kong, SAR, China
F
Feng Yin
School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen, Shenzhen 518172, China
J
Juan Maronas
Machine Learning Group, Universidad Autonoma de Madrid, and Department of Quantitative Methods at CUNEF University
A
Alexandre H. Thi'ery
Department of Statistics and Data Science, National University of Singapore, Singapore 117546