🤖 AI Summary
Problem: Modeling nonstationary time series with high-order temporal dependencies, which are particularly prevalent in financial applications, remains challenging due to structural instability and complex latent state dynamics.
Method: This paper proposes a Bayesian context-tree-based state-space mixture framework. It is the first to embed variable-order Markov structures (i.e., context trees) into a Bayesian state-space model, enabling joint inference of discrete states, dynamic structure, and model parameters. The framework flexibly integrates arbitrary base predictors (e.g., linear models or neural networks), balancing interpretability with expressive power. Inference leverages context-tree weighting, variational autoencoding approximations, and recursive Bayesian filtering.
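As background on the context-tree weighting (CTW) component named above, here is a minimal sketch for binary sequences. This is illustrative only, not the paper's implementation; `CTWNode` and `ctw_log_prob` are invented names, and the real framework mixes over richer base predictors rather than the Krichevsky-Trofimov (KT) estimator used here:

```python
import math

class CTWNode:
    """One node of a binary context tree: KT counts plus mixture probabilities."""
    def __init__(self):
        self.counts = [0, 0]   # how many 0s / 1s were observed in this context
        self.log_pe = 0.0      # log KT-estimator probability at this node
        self.log_pw = 0.0      # log weighted (CTW mixture) probability
        self.children = {}     # one child per preceding symbol (0 or 1)

def _update(node, context, bit):
    """Update KT and weighted probabilities along one context path."""
    # Sequential KT update: P(next = bit) = (counts[bit] + 1/2) / (n + 1)
    n = node.counts[0] + node.counts[1]
    node.log_pe += math.log((node.counts[bit] + 0.5) / (n + 1.0))
    node.counts[bit] += 1
    if context:
        # Internal node: recurse into the child selected by the most recent
        # past symbol, then mix: Pw = 1/2 * Pe + 1/2 * prod(children's Pw).
        child = node.children.setdefault(context[-1], CTWNode())
        _update(child, context[:-1], bit)
        log_kids = sum(c.log_pw for c in node.children.values())
        m = max(node.log_pe, log_kids)  # stable log-sum-exp
        node.log_pw = math.log(0.5) + m + math.log(
            math.exp(node.log_pe - m) + math.exp(log_kids - m))
    else:
        # Leaf at maximum depth: the weighted probability is the KT estimate.
        node.log_pw = node.log_pe

def ctw_log_prob(bits, depth=3):
    """Log-probability CTW assigns to `bits`, padding the initial past with 0s."""
    root = CTWNode()
    past = [0] * depth
    for bit in bits:
        _update(root, tuple(past[-depth:]), bit)
        past.append(bit)
    return root.log_pw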
Results: Evaluated on diverse real-world financial and macroeconomic time series, the method outperforms higher-order HMM and RNN baselines in long-horizon forecasting accuracy, while offering strong structural interpretability and superior computational efficiency.