Mitigating optimistic bias in entropic risk estimation and optimization with an application to insurance

📅 2024-09-30
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Under small-sample regimes, the empirical entropic risk estimator systematically underestimates tail risk, leading to underpriced premiums and elevated underwriting risk for insurers. To address this, we propose a strongly asymptotically consistent two-stage bootstrap bias-correction method: first, a Gaussian mixture model (GMM) is fitted to the data distribution using one of two strategies, a computationally intensive one and a simpler, tail-sensitive one; second, the bias of the empirical entropic risk estimator is estimated via bootstrapping and corrected. This work establishes the first asymptotically unbiased correction framework for entropic risk measures and integrates the correction mechanism into distributionally robust optimization and insurance contract design, thereby uncovering the statistical origin of premium underestimation. Empirical results demonstrate substantial reductions in estimation bias and improved robustness of risk estimation. In a real-world application, the method yields more actuarially sound homeowners' insurance premiums, adjusted upward with reduced variance, and significantly lowers the insurer's out-of-sample risk.

Technology Category

Application Category

📝 Abstract
The entropic risk measure is widely used in high-stakes decision making to account for tail risks associated with an uncertain loss. With limited data, the empirical entropic risk estimator, i.e., replacing the expectation in the entropic risk measure with a sample average, underestimates the true risk. To mitigate the bias in the empirical entropic risk estimator, we propose a strongly asymptotically consistent bootstrapping procedure. The first step of the procedure involves fitting a distribution to the data, whereas the second step estimates the bias of the empirical entropic risk estimator using bootstrapping and corrects for it. Two methods are proposed to fit a Gaussian mixture model to the data: a computationally intensive one that fits the distribution of the empirical entropic risk, and a simpler one with a component that fits the tail of the empirical distribution. As an application of our approach, we study distributionally robust entropic risk minimization problems with a type-$\infty$ Wasserstein ambiguity set and apply our bias correction to debias validation performance. Furthermore, we propose a distributionally robust optimization model for an insurance contract design problem that takes into account the correlations of losses across households. We show that choosing regularization parameters via cross-validation can result in significantly higher out-of-sample risk for the insurer if the bias in validation performance is not corrected for. This improvement in performance can be explained by the observation that our methods suggest a higher (and more accurate) premium to homeowners.
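The two-step procedure described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: for simplicity it fits a single Gaussian to the data (the paper fits a Gaussian mixture model with the two strategies described above), exploits the closed-form entropic risk of a Gaussian, $\mu + \alpha\sigma^2/2$, as the "true" risk under the fitted model, and uses that to estimate and subtract the estimator's bias; the function names and the choice of `n_boot` are illustrative assumptions.

```python
import numpy as np

def entropic_risk(x, alpha):
    # Empirical entropic risk: (1/alpha) * log(mean(exp(alpha * x))),
    # computed via log-sum-exp for numerical stability.
    a = alpha * np.asarray(x)
    m = a.max()
    return (m + np.log(np.mean(np.exp(a - m)))) / alpha

def debiased_entropic_risk(x, alpha, n_boot=500, rng=None):
    # Two-step bootstrap bias correction (single-Gaussian stand-in
    # for the paper's GMM fit).
    rng = np.random.default_rng(rng)
    x = np.asarray(x)
    n = len(x)
    # Step 1: fit a parametric model to the data.
    mu, sigma = x.mean(), x.std(ddof=1)
    # Entropic risk of N(mu, sigma^2) in closed form.
    model_risk = mu + alpha * sigma**2 / 2
    # Step 2: estimate the empirical estimator's bias by resampling
    # size-n datasets from the fitted model, then correct for it.
    boot = [entropic_risk(rng.normal(mu, sigma, n), alpha)
            for _ in range(n_boot)]
    bias = np.mean(boot) - model_risk  # typically negative (Jensen's inequality)
    return entropic_risk(x, alpha) - bias
```

Because $\log$ is concave, Jensen's inequality makes the estimated bias negative in expectation, so the corrected estimate sits above the plain empirical one, consistent with the upward premium adjustment reported in the paper.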
Problem

Research questions and friction points this paper is trying to address.

Risk Underestimation
Entropic Risk Measurement
Insurance Premium Setting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Entropic Risk Assessment
Pattern Recognition Techniques
Risk Minimization Strategies
🔎 Similar Papers
No similar papers found.