Wasserstein complexity penalization priors: a new class of penalizing complexity priors

📅 2023-12-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional penalized complexity (PC) priors rely on the Kullback–Leibler divergence (KLD) to quantify model complexity, but the KLD is infinite when the base and target models are incompatible, rendering the prior undefined, and existing workarounds often compromise interpretability. Method: the paper proposes the Wasserstein complexity penalization (WCP) prior, which replaces the KLD with the Wasserstein distance, inherently avoiding infinite penalties, and introduces base measures that decouple the parameter structure from the base model, enabling joint multi-parameter modeling and closed-form construction. Contribution/Results: this is the first systematic integration of the Wasserstein distance into the PC framework, yielding interpretable and computationally tractable complexity control in multivariate settings, together with a complete R implementation that extends both the applicability and practical utility of PC priors.
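The motivating problem can be seen in a toy example (not from the paper): for two distributions with disjoint supports, the KL divergence is infinite, so a KLD-based penalty is undefined, while the Wasserstein distance stays finite. A minimal Python sketch using SciPy:

```python
# Toy contrast of KLD vs. Wasserstein distance on two discrete distributions
# with disjoint supports (an assumed example, not one of the paper's models).
import numpy as np
from scipy.stats import entropy, wasserstein_distance

support = np.array([0.0, 1.0, 2.0, 3.0])
p = np.array([0.5, 0.5, 0.0, 0.0])  # "complex" model: mass on {0, 1}
q = np.array([0.0, 0.0, 0.5, 0.5])  # base model: mass on {2, 3}

kld = entropy(p, q)  # KL(p || q): infinite because supports are disjoint
w1 = wasserstein_distance(support, support, p, q)  # finite: each unit of mass moves by 2

print(kld)  # inf -> a KLD-based PC prior cannot be constructed here
print(w1)   # 2.0 -> a Wasserstein-based penalty remains well defined
```

The Wasserstein distance accounts for the geometry of the sample space (how far mass must move), which is why it stays finite even when the two models share no support.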
📝 Abstract
Penalizing complexity (PC) priors provide a principled framework for reducing model complexity by penalizing the Kullback–Leibler divergence (KLD) between a "simple" base model and a more complex model. However, constructing priors by penalizing the KLD becomes impossible in many cases because the KLD is infinite, and alternative principles often lose interpretability in terms of the KLD. We propose a new class of priors, the Wasserstein complexity penalization (WCP) priors, which replace the KLD with the Wasserstein distance in the PC prior framework. WCP priors avoid the issue of infinite model distances and retain interpretability by adhering to adjusted principles. Additionally, we introduce the concept of base measures, removing the parameter dependency on the base model, and extend the framework to joint WCP priors for multiple parameters. These priors can be constructed analytically, and we provide both analytical and numerical implementations in the R programming language. We demonstrate their use in previous PC prior applications as well as in new multivariate settings.
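The construction principle behind the PC framework puts an exponential prior on the distance scale and changes variables back to the parameter, π(ξ) = λ exp(-λ d(ξ)) |d'(ξ)|. The sketch below is a hedged illustration, not the paper's R implementation: it assumes the closed-form 2-Wasserstein distance d(σ) = W2(N(0, σ²), δ₀) = σ from a Gaussian to a point-mass base model, and an arbitrary rate λ = 2.

```python
# Hedged sketch of the generic PC-style construction with a user-supplied
# distance: pi(xi) = lam * exp(-lam * d(xi)) * |d'(xi)|.
# Assumption: d(sigma) = sigma, the closed-form 2-Wasserstein distance from
# N(0, sigma^2) to a point mass at 0; lam = 2.0 is illustrative.
import numpy as np
from scipy.integrate import quad

def wcp_prior(d, d_prime, lam):
    """Exponential prior on the distance scale, pushed back to the parameter."""
    return lambda xi: lam * np.exp(-lam * d(xi)) * abs(d_prime(xi))

# With d(sigma) = sigma the resulting prior is simply Exponential(lam).
prior = wcp_prior(d=lambda s: s, d_prime=lambda s: 1.0, lam=2.0)

mass, _ = quad(prior, 0.0, np.inf)  # a valid density integrates to 1
print(round(mass, 6))  # 1.0
```

Because the penalty is applied on the distance scale, the same few lines work for any parameter once its distance to the base model (and the derivative of that distance) is available in closed form.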
Problem

Research questions and friction points this paper is trying to address.

The KLD between a base model and a more complex model is infinite in many cases, making KLD-based PC priors impossible to construct
Alternative construction principles often lose interpretability in terms of the KLD
The prior construction depends on the parameters of the base model, and there is no joint construction for multiple parameters
Innovation

Methods, ideas, or system contributions that make the work stand out.

Replace the KLD with the Wasserstein distance in the PC prior framework
Introduce base measures that remove the parameter dependency on the base model
Extend the framework to joint WCP priors for multiple parameters