Towards Tsallis Fully Probabilistic Design

📅 2026-02-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a fundamental limitation in fully probabilistic design: replacing the conventional Kullback–Leibler (KL) divergence with Tsallis divergence breaks the chain rule, thereby preventing backward recursive solutions. For the first time, Tsallis divergence is incorporated into this framework through the development of a fixed-point iteration algorithm. The authors rigorously establish the existence of a solution and construct a provably convergent numerical scheme for its computation. By working with the subadditivity property of Tsallis divergence in place of the KL chain rule, this approach broadens the class of admissible problem formulations, enhancing the flexibility and applicability of Bayesian decision modeling.
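To make the divergence swap concrete, the sketch below computes the Tsallis (q-) divergence between discrete distributions, D_q(p‖r) = (Σᵢ pᵢ^q rᵢ^(1−q) − 1)/(q − 1), and checks numerically that it recovers the KL divergence in the limit q → 1. The function names are illustrative, not taken from the paper.

```python
import numpy as np

def tsallis_divergence(p, r, q):
    """Tsallis divergence D_q(p || r) for discrete distributions.

    D_q(p || r) = (sum_i p_i^q * r_i^(1-q) - 1) / (q - 1).
    As q -> 1 this recovers the Kullback-Leibler divergence.
    """
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    return (np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0)

def kl_divergence(p, r):
    """Standard KL divergence for strictly positive discrete distributions."""
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    return np.sum(p * np.log(p / r))

p = np.array([0.2, 0.5, 0.3])
r = np.array([0.3, 0.4, 0.3])
# For q near 1, the Tsallis divergence approximates the KL divergence.
print(tsallis_divergence(p, r, q=1.001), kl_divergence(p, r))
```

Unlike KL, the Tsallis divergence for q ≠ 1 satisfies only subadditivity over products of distributions, which is precisely why the standard backward recursion of fully probabilistic design no longer applies.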

📝 Abstract
In this paper we present the foundations of Fully Probabilistic Design for the case when the Kullback–Leibler divergence is replaced by the Tsallis divergence. Because the standard chain rule is replaced by subadditivity, an immediate backward recursion is not available. However, by forming a fixed-point iteration, we can establish a constructive proof of the existence of a solution to this problem, which also constitutes an algorithmic scheme that iteratively converges to this solution. This development can provide greater versatility in Bayesian Decision Making by adding flexibility to the problem formulation.
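The constructive existence proof described above takes the form of a fixed-point iteration p ← F(p) on a space of distributions. The paper's actual operator F is not given in this summary, so the sketch below uses a hypothetical stand-in contraction (a normalized geometric mixture with an "ideal" distribution m, whose fixed point is m itself) purely to illustrate the iterate-until-convergence structure of such a scheme.

```python
import numpy as np

def fixed_point_iterate(m, p0, max_steps=200, tol=1e-12):
    """Toy fixed-point iteration p <- F(p) on the probability simplex.

    F(p) here is the normalized elementwise geometric mean of p and the
    ideal distribution m -- an illustrative contraction, NOT the operator
    from the paper. In log-space each step halves the gap log p - log m,
    so the iterates converge geometrically to the fixed point p* = m.
    """
    p = np.asarray(p0, dtype=float)
    m = np.asarray(m, dtype=float)
    for _ in range(max_steps):
        nxt = np.sqrt(p * m)      # geometric mixture of current and ideal
        nxt /= nxt.sum()          # renormalize onto the simplex
        if np.max(np.abs(nxt - p)) < tol:
            return nxt
        p = nxt
    return p

ideal = np.array([0.6, 0.3, 0.1])
start = np.array([1/3, 1/3, 1/3])
print(fixed_point_iterate(ideal, start))  # converges toward `ideal`
```

The same skeleton (apply F, renormalize, test a stopping criterion) carries over once F is replaced by the Tsallis-divergence-minimizing operator whose existence and convergence the paper establishes.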
Problem

Research questions and friction points this paper is trying to address.

Tsallis divergence
Fully Probabilistic Design
Kullback-Leibler divergence
subadditivity
Bayesian Decision Making
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tsallis divergence
Fully Probabilistic Design
fixed point iteration
Bayesian Decision Making
non-additive entropy