When Is Generalized Bayes Bayesian? A Decision-Theoretic Characterization of Loss-Based Updating

📅 2026-02-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study clarifies under what conditions loss-driven generalized Bayesian posterior updates admit a genuine Bayesian interpretation. Within a decision-theoretic framework, it distinguishes for the first time between “belief posteriors” and “decision posteriors.” Building on the Savage and Anscombe–Aumann axiomatic systems, exponential tilting, and sequential consistency assumptions—and leveraging variational representations and entropy regularization—the work demonstrates that generalized Bayesian updating coincides with standard Bayesian updating if and only if the loss function is, up to scale and a data-only term, the negative log-likelihood. The paper establishes necessary and sufficient conditions for generalized Bayes to be an optimal decision rule, shows that non-degenerate posteriors require nonlinear preferences over decision rules, and argues that marginal likelihoods and Bayes factors lack intrinsic evidential meaning within the decision-posterior framework.

📝 Abstract
Loss-based updating, including generalized Bayes, Gibbs, and quasi-posteriors, replaces the likelihood with a user-chosen loss and produces a posterior-like distribution via an exponential tilt. We give a decision-theoretic characterization that separates “belief posteriors” -- conditional beliefs justified by the foundations of Savage and Anscombe-Aumann under a joint probability model -- from “decision posteriors” -- randomized decision rules justified by preferences over decision rules. We make explicit that a loss-based posterior coincides with ordinary Bayes if and only if the loss is, up to scale and a data-only term, the negative log-likelihood. We then show that the generalized marginal likelihood is not evidence for decision posteriors, and that Bayes factors are not well-defined without additional structure. In the decision-posterior regime, non-degenerate posteriors require nonlinear preferences over decision rules. Under sequential coherence and separability, these preferences lead to an entropy-penalized variational representation that yields generalized Bayes as the optimal rule.
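The exponential-tilt construction the abstract describes can be illustrated with a small numerical sketch. This is not the paper's own code; it is a standard Gibbs-posterior computation on a parameter grid, with the squared-error loss, the flat prior, and the learning rate `lam` chosen purely for illustration.

```python
import numpy as np

# Hypothetical data and a grid of candidate parameter values (means).
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=50)
theta_grid = np.linspace(-2.0, 4.0, 601)

# User-chosen loss: squared error (not, in general, a negative log-likelihood).
loss = np.array([np.sum((data - t) ** 2) for t in theta_grid])

# Exponential tilt of a flat prior: post(theta) ∝ prior(theta) * exp(-lam * loss).
lam = 0.5
log_post = -lam * loss          # log prior is constant on the grid
log_post -= log_post.max()      # subtract max for numerical stability
post = np.exp(log_post)
post /= post.sum()              # normalize over the grid

# The Gibbs-posterior mean concentrates near the empirical mean of the data.
post_mean = float(np.sum(theta_grid * post))
print(post_mean)
```

Note the connection to the paper's characterization: with `lam = 0.5` and squared-error loss, the tilt `exp(-0.5 * sum((x - theta)**2))` is proportional to a Gaussian likelihood with unit variance, so this particular loss-based posterior coincides with an ordinary Bayes posterior. For other losses or other values of `lam`, the tilted distribution is generally not a conditional belief, which is exactly the belief-vs-decision distinction the paper draws.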
Problem

Research questions and friction points this paper is trying to address.

generalized Bayes
loss-based updating
decision-theoretic characterization
belief posterior
decision posterior
Innovation

Methods, ideas, or system contributions that make the work stand out.

generalized Bayes
decision-theoretic characterization
loss-based updating
belief vs. decision posteriors
entropy-penalized variational representation
Kenichiro McAlinn
Temple University
Bayesian Statistics · Econometrics
Kōsaku Takanashi
RIKEN Center for Advanced Intelligence Project