A comparison between initialization strategies for the infinite hidden Markov model

📅 2025-12-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Infinite hidden Markov models (iHMMs) lack principled initialization strategies, and the commonly adopted uniform initialization is suboptimal. Method: We systematically evaluate finite-HMM-inspired initialization schemes in the infinite-state setting and propose a distance-based clustering initialization. Coupled with a hierarchical Dirichlet process prior and a beam sampler that integrates dynamic programming with slice sampling, the approach enables adaptive truncation of the infinite state space and efficient Bayesian inference. Contribution/Results: Experiments on synthetic and real-world time-series data show that the proposed initialization improves both the accuracy of structural-change detection and convergence speed, outperforming uniform initialization and other baselines, yielding an empirically validated initialization framework for practical iHMM deployment.

📝 Abstract
Infinite hidden Markov models provide a flexible framework for modelling time series with structural changes and complex dynamics, without requiring the number of latent states to be specified in advance. This flexibility is achieved through the hierarchical Dirichlet process prior, while efficient Bayesian inference is enabled by the beam sampler, which combines dynamic programming with slice sampling to truncate the infinite state space adaptively. Despite extensive methodological developments, the role of initialization in this framework has received limited attention. This study addresses this gap by systematically evaluating initialization strategies commonly used for finite hidden Markov models and assessing their suitability in the infinite setting. Results from both simulated and real datasets show that distance-based clustering initializations consistently outperform model-based and uniform alternatives, the latter being the most widely adopted in the existing literature.
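The adaptive truncation the abstract attributes to the beam sampler can be sketched in a few lines. The function below is illustrative code, not the authors' implementation; the names `gamma` and `u_min` are assumptions. It draws stick-breaking weights from the GEM prior only until the leftover stick mass falls below the smallest slice variable, since states whose weight cannot exceed that slice can never be visited on the current sweep — this is what reduces the infinite state space to a finite one at each iteration.

```python
import random

def stick_breaking_truncation(gamma, u_min, seed=0):
    """Draw stick-breaking weights beta_k ~ GEM(gamma) until the
    remaining stick mass drops below u_min, the smallest slice
    variable of the sweep. Returns the finite set of weights the
    sampler needs to consider (a sketch of beam-sampler truncation)."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    while remaining > u_min:
        v = rng.betavariate(1.0, gamma)   # Beta(1, gamma) stick fraction
        weights.append(remaining * v)     # mass broken off for state k
        remaining *= 1.0 - v              # mass left on the stick
    return weights

# A smaller slice variable forces a longer (finer) truncation.
weights = stick_breaking_truncation(gamma=1.0, u_min=1e-3)
```

Note that the truncation level is random and adapts per iteration: it is determined by the slice variables rather than fixed in advance, which is the practical advantage over a hard truncation of the Dirichlet process.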
Problem

Research questions and friction points this paper is trying to address.

Evaluates initialization strategies for infinite hidden Markov models
Compares clustering, model-based, and uniform initialization methods
Identifies distance-based clustering as superior for infinite-state inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses hierarchical Dirichlet process prior for flexible modeling
Applies beam sampler for efficient Bayesian inference
Evaluates distance-based clustering for initialization strategies
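As a rough illustration of a distance-based clustering initializer, the sketch below (hypothetical code, not the authors' exact procedure) runs a plain one-dimensional k-means on the observations and uses the cluster labels as the initial hidden-state sequence, in place of assigning states uniformly at random.

```python
def kmeans_init_states(y, n_states, n_iter=20):
    """Distance-based initialization sketch: cluster the 1-D
    observations with k-means and return the cluster labels as
    the initial hidden-state sequence. Centres are seeded
    deterministically across the data range for reproducibility."""
    y = [float(v) for v in y]
    lo, hi = min(y), max(y)
    centers = [lo + (hi - lo) * k / (n_states - 1) for k in range(n_states)]
    labels = [0] * len(y)
    for _ in range(n_iter):
        # Assign each observation to its nearest centre.
        labels = [min(range(n_states), key=lambda j: abs(v - centers[j]))
                  for v in y]
        # Recompute each centre as the mean of its members.
        for k in range(n_states):
            members = [v for v, lab in zip(y, labels) if lab == k]
            if members:
                centers[k] = sum(members) / len(members)
    return labels

# Two well-separated regimes yield two clean initial states.
init_states = kmeans_init_states([0.1, 0.0, -0.1, 5.0, 5.1, 4.9], n_states=2)
```

The intuition matching the paper's finding: labels that already respect the geometry of the observations give the sampler a starting partition close to the true regimes, so fewer sweeps are spent escaping a poor uniform assignment.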
Federico P. Cortese
Department of Economics, Management, and Quantitative Methods, University of Milan
Luca Rossini
Associate Professor in Statistics - University of Milan
Bayesian nonparametrics · Econometrics · Energy · Forecasting · Copula Models