Bayesian Mixture Models for Heterogeneous Extremes

📅 2025-09-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Traditional generalized extreme value (GEV) models assume homogeneity among block maxima, limiting their ability to characterize heterogeneous extreme events from multiple sources and leading to biased risk estimates. To address this, we propose a Bayesian nonparametric mixture model that employs a Dirichlet process prior to adaptively infer an unknown number of heterogeneous extreme-value clusters, each assigned its own GEV distribution. Full Bayesian inference is conducted via Markov chain Monte Carlo (MCMC). This work represents the first systematic integration of Dirichlet process mixture models into extreme value analysis, simultaneously preserving the theoretical rigor of extreme value theory and enabling flexible, data-driven cluster structure identification—without pre-specifying the number of groups. Simulation studies and empirical applications demonstrate that the method accurately uncovers heterogeneity in extreme behavior, substantially improving the accuracy of risk measures (e.g., return levels, failure probabilities) and enhancing model robustness.

Technology Category

Application Category

📝 Abstract
The conventional use of the Generalized Extreme Value (GEV) distribution to model block maxima may be inappropriate when extremes are actually structured into multiple heterogeneous groups. This can result in inaccurate risk estimation of extreme events based on return levels. In this work, we propose a novel approach for describing the behavior of extreme values in the presence of such heterogeneity. Rather than defaulting to the GEV distribution simply because it arises as a theoretical limit, we show that alternative block-maxima-based models can also align with the extremal types theorem while providing improved robustness and flexibility in practice. Our formulation leads us to a mixture model that has a Bayesian nonparametric interpretation as a Dirichlet process mixture of GEV distributions. The use of an infinite number of components enables the characterization of every possible block behavior, while at the same time defining similarities between observations based on their extremal behavior. By employing a Dirichlet process prior on the mixing measure, we can capture the complex structure of the data without the need to pre-specify the number of mixture components. The application of the proposed model is illustrated using both simulated and real-world data.
Problem

Research questions and friction points this paper is trying to address.

Modeling heterogeneous extremes with Bayesian mixture distributions
Addressing inaccurate risk estimation from conventional GEV methods
Capturing complex extreme data structure without pre-specified components
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian nonparametric Dirichlet process mixture
Infinite GEV components for block behavior
No pre-specified mixture component number