Rate-Distortion-Perception Tradeoff Based on the Conditional-Distribution Perception Measure

📅 2024-01-22
🏛️ IEEE Transactions on Information Theory
📈 Citations: 11
Influential: 0
🤖 AI Summary
This work characterizes the rate-distortion-perception (RDP) tradeoff when there is no shared randomness between encoder and decoder. For discrete memoryless sources, it establishes a single-letter characterization of the RDP function under the conditional-distribution perception measure of Mentzer et al., in contrast to the marginal-distribution measure (introduced by Blau and Michaeli), whose RDP characterization without shared randomness remains open. For continuous-valued sources under squared-error distortion and the squared quadratic Wasserstein perception measure, it likewise derives a single-letter characterization, shows that the decoder can be restricted to a noise-adding mechanism, and proves that the RDP function at zero perception loss coincides with that of the marginal metric, achievable with only a 3-dB penalty in minimum distortion. Specializing to Gaussian sources, it derives the RDP function for the Gaussian vector case via a reverse water-filling-type solution and partially characterizes the RDP function for a mixture of Gaussian vector sources. The achievability scheme is based on lossy source coding with a posterior reference map.

📝 Abstract
This paper studies the rate-distortion-perception (RDP) tradeoff for a memoryless source model in the asymptotic limit of large block-lengths. The perception measure is based on a divergence between the distributions of the source and reconstruction sequences conditioned on the encoder output, first proposed by Mentzer et al. We consider the case when there is no shared randomness between the encoder and the decoder and derive a single-letter characterization of the RDP function, for the case of discrete memoryless sources. This is in contrast to the marginal-distribution metric case (introduced by Blau and Michaeli), whose RDP characterization remains open when there is no shared randomness. The achievability scheme is based on lossy source coding with a posterior reference map. For the case of continuous valued sources under the squared error distortion measure and the squared quadratic Wasserstein perception measure, we also derive a single-letter characterization and show that the decoder can be restricted to a noise-adding mechanism. Interestingly, the RDP function characterized for the case of zero perception loss coincides with that of the marginal metric, and further zero perception loss can be achieved with a 3-dB penalty in minimum distortion. Finally we specialize to the case of Gaussian sources, and derive the RDP function for Gaussian vector case and propose a reverse water-filling type solution. We also partially characterize the RDP function for a mixture of Gaussian vector sources.
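The "3-dB penalty in minimum distortion" stated above is a factor-of-two increase in distortion. A minimal numeric sketch, using the classical Gaussian rate-distortion function D(R) = σ²·2^(−2R) (the function names and the choice of σ² = 1, R = 1 are illustrative, not from the paper):

```python
import math

def gaussian_rd(sigma2: float, rate: float) -> float:
    """Classical Gaussian rate-distortion function: D(R) = sigma^2 * 2^(-2R)."""
    return sigma2 * 2 ** (-2 * rate)

sigma2, rate = 1.0, 1.0
d_min = gaussian_rd(sigma2, rate)   # minimum distortion, no perception constraint
d_zero_perception = 2 * d_min       # distortion at zero perception loss (3-dB penalty)
penalty_db = 10 * math.log10(d_zero_perception / d_min)
print(round(penalty_db, 2))  # → 3.01
```

Doubling the distortion is exactly 10·log10(2) ≈ 3.01 dB, matching the abstract's statement.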
Problem

Research questions and friction points this paper is trying to address.

Studies RDP tradeoff for memoryless sources with conditional-distribution perception.
Derives single-letter RDP function for discrete sources without shared randomness.
Characterizes RDP for Gaussian sources with reverse water-filling solution.
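For intuition on the reverse water-filling solution mentioned above, here is a sketch of the *classical* reverse water-filling allocation for a Gaussian vector source (the paper's perception-constrained variant differs in detail; the function name, bisection approach, and example variances are illustrative assumptions):

```python
import math

def reverse_waterfill(sigma2s, total_D):
    """Classical reverse water-filling for independent Gaussian components:
    pick a water level theta so that sum_i min(theta, sigma_i^2) = total_D,
    then spend rate only on components with variance above the level.
    Assumes 0 < total_D <= sum(sigma2s)."""
    lo, hi = 0.0, max(sigma2s)
    for _ in range(100):  # bisection on the water level theta
        theta = (lo + hi) / 2
        if sum(min(theta, s) for s in sigma2s) < total_D:
            lo = theta
        else:
            hi = theta
    theta = (lo + hi) / 2
    Ds = [min(theta, s) for s in sigma2s]  # per-component distortions
    R = sum(0.5 * math.log2(s / d) for s, d in zip(sigma2s, Ds) if s > d)
    return R, Ds

# Example: variances (4, 1, 0.25), total distortion budget 1.0
R, Ds = reverse_waterfill([4.0, 1.0, 0.25], total_D=1.0)
```

In the example, the weakest component (variance 0.25) falls below the water level and receives zero rate, while the distortion budget is split across the stronger two.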
Innovation

Methods, ideas, or system contributions that make the work stand out.

Conditional-distribution perception measure for RDP tradeoff
Lossy source coding with posterior reference map
Noise-adding decoder for continuous valued sources
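The noise-adding decoder idea can be sketched as follows: the decoder outputs its codeword-based estimate plus independent Gaussian noise sized so the reconstruction's variance matches the source's. This is a hedged illustration of the mechanism, not the paper's exact construction; the function name and parameters are assumptions:

```python
import random

def noise_adding_decoder(estimate, source_var, estimate_var):
    """Sketch of a noise-adding decoder for a zero-mean Gaussian source:
    add independent Gaussian noise to the codeword-based estimate so that
    the output variance equals the source variance (distribution matching).
    Assumes estimate_var <= source_var."""
    noise_var = source_var - estimate_var
    return estimate + random.gauss(0.0, noise_var ** 0.5)
```

When the estimate already carries the full source variance, no noise is added and the decoder is deterministic.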
Sadaf Salehkalaibar
Assistant Professor at University of Manitoba
Information theory · Machine learning · Security
Jun Chen
Department of Electrical and Computer Engineering at McMaster University, Hamilton, ON L8S 4K1, Canada
A. Khisti
Department of Electrical and Computer Engineering at the University of Toronto, Toronto, M5S 3G4, Canada
Wei Yu
Department of Electrical and Computer Engineering at the University of Toronto, Toronto, M5S 3G4, Canada