🤖 AI Summary
This paper investigates the rate-distortion-perception function (RDPF) trade-off for Gaussian sources under a mean-squared error (MSE) distortion constraint and a perception constraint drawn from the family of α-divergences. To characterize this information-theoretic quantity, which is central to goal-oriented compression, we derive a parametric closed-form solution for the jointly Gaussian RDPF and show that evaluating its optimal parameters reduces to finding the real roots of a reduced exponential polynomial. Leveraging the monotonicity of the α-divergence, we partition the feasible region into disjoint intervals, each containing at most one real root, which enables efficient numerical computation via the bisection method. The approach integrates information theory, convex optimization, Gaussian modeling, and α-divergence theory. We obtain, for the first time, a computable upper bound on the RDPF; numerical experiments confirm the theoretical predictions. Moreover, our framework unifies existing special cases, e.g., recovering the KL-divergence-based RDPF as α → 1.
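For context, one standard parametrization of the α-divergence family is sketched below in LaTeX. Conventions for α-divergences differ across the literature, so the exact normalization used in the paper may vary, but the KL-divergence limit at α → 1 mentioned above holds in this form.

```latex
% One common parametrization of the \alpha-divergence (an assumption here;
% the paper's exact convention may differ). For densities p, q and \alpha \neq 0, 1:
D_{\alpha}(p \,\|\, q) = \frac{1}{\alpha(\alpha - 1)}
  \left( \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}x - 1 \right),
% which recovers the Kullback--Leibler divergence in the limit:
\lim_{\alpha \to 1} D_{\alpha}(p \,\|\, q) = D_{\mathrm{KL}}(p \,\|\, q).
```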
📝 Abstract
The problem of estimating the information rate-distortion-perception function (RDPF), a relevant information-theoretic quantity in goal-oriented lossy compression and semantic information reconstruction, is investigated here. Specifically, we study the RDPF trade-off for Gaussian sources subject to a mean-squared error (MSE) distortion constraint and a perception measure that belongs to the family of α-divergences. Assuming a jointly Gaussian source-reconstruction pair, under which the problem becomes a convex optimization, we characterize an upper bound for which we find a parametric solution. We show that evaluating the optimal parameters of this parametric solution is equivalent to finding the roots of a reduced exponential polynomial of degree α. Additionally, we determine the disjoint sets that contain each root, which enables us to evaluate the roots numerically using the well-known bisection method. Finally, we validate our analytical findings with numerical results and establish connections with existing results.
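As a concrete illustration of the final computational step, here is a minimal bisection sketch in Python. The stand-in function (a power expression with non-integer exponent) and the interval endpoints are hypothetical; the actual reduced exponential polynomial and the disjoint intervals bracketing its roots follow from the paper's parametric solution and are not reproduced here.

```python
def bisect_root(f, lo, hi, tol=1e-12, max_iter=200):
    """Locate the unique root of f in [lo, hi] by bisection.

    Assumes f is continuous and changes sign exactly once on [lo, hi],
    mirroring the guarantee that each disjoint interval contains at
    most one real root of the reduced exponential polynomial.
    """
    f_lo = f(lo)
    if f_lo == 0.0:
        return lo
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        f_mid = f(mid)
        if f_mid == 0.0 or (hi - lo) < tol:
            return mid
        if (f_lo > 0) != (f_mid > 0):
            hi = mid               # sign change in [lo, mid]: root lies there
        else:
            lo, f_lo = mid, f_mid  # sign change in [mid, hi]
    return 0.5 * (lo + hi)

# Hypothetical stand-in for the reduced exponential polynomial:
# g(x) = x**alpha - c with a non-integer exponent alpha (illustration only).
alpha, c = 1.5, 2.0
root = bisect_root(lambda x: x**alpha - c, lo=0.0, hi=10.0)
print(root, c ** (1.0 / alpha))  # bisection result vs. closed-form root
```

Bisection is a natural fit here: since each interval is guaranteed to bracket at most one sign change, convergence to the unique root is assured, gaining roughly one bit of accuracy per iteration.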