🤖 AI Summary
This study investigates the rate-distortion function of vector Gaussian sources under component-wise individual distortion constraints, with a focus on characterizing the fundamental limits of compression when the semidefinite condition (SDC) fails to hold. By combining information-theoretic analysis, rate-distortion theory, and semidefinite programming within an analytically tractable two-type correlation (2TC) covariance model, this work provides the first systematic characterization of optimal compression performance in the absence of the SDC. Key contributions include proving that the probability of the SDC being satisfied decays exponentially with source dimension; establishing the necessity of low-dimensional reconstruction and deriving an upper bound on the optimal reconstruction dimension; and obtaining an explicit rate-distortion expression that captures the influence of the source correlation structure on compression gains. Furthermore, the precise covariance conditions required to achieve the Hadamard compression rate are identified.
📝 Abstract
This paper investigates the joint compression problem of a vector Gaussian source, where an individual distortion constraint is imposed on each source component. It is known that the rate-distortion function (RDF) is lower-bounded by the rate derived from the Hadamard inequality, and that this bound becomes exact when the semidefinite condition (SDC) holds. However, existing works often overlook the case where the SDC is not satisfied. Moreover, even when the SDC holds, a quantitative characterization of how correlations enable more efficient compression is lacking. In this work, we refine the results for the case where the SDC is satisfied and derive new theoretical results for the case where it is not, thereby establishing theoretical limits for the practical compression of correlated sources. Specifically, we examine the properties of optimal source reconstruction and provide upper bounds on its dimension, showing that lower-dimensional reconstructions are essential for efficient compression when the SDC does not hold. Within a scalable two-type correlation (2TC) covariance framework, we prove that the probability of satisfying the SDC decays exponentially with source length, underscoring the importance of exploring theoretical limits when the SDC is not met. Additionally, we determine the component-wise correlations that a vector source should possess to achieve the Hadamard compression rate, revealing the trade-off between distortion constraints and correlations. More importantly, by deriving an explicit RDF with correlations incorporated, we quantitatively characterize the gain in compression efficiency achieved by fully leveraging source correlations.
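To make the Hadamard lower bound mentioned above concrete, the following sketch evaluates it numerically. For a Gaussian source with covariance Σ and per-component distortion constraints D_i, the standard Hadamard-based bound is R ≥ ½ log(det Σ / ∏ D_i) (in nats), which is tight when the SDC holds. The block-structured covariance below is only an illustrative guess at a 2TC-style matrix (two correlation values, within- and across-block); the paper's exact 2TC parameterization is not reproduced here.

```python
import numpy as np

def hadamard_lower_bound(Sigma, D):
    """Hadamard lower bound (nats) on the RDF of a vector Gaussian
    source with covariance Sigma under per-component distortions D_i:
        R >= 1/2 * log( det(Sigma) / prod(D_i) ),
    exact when the semidefinite condition (SDC) holds."""
    sign, logdet = np.linalg.slogdet(Sigma)
    assert sign > 0, "Sigma must be positive definite"
    return max(0.0, 0.5 * (logdet - np.sum(np.log(D))))

# Illustrative covariance in the spirit of the two-type correlation
# (2TC) model (the exact parameterization is an assumption here):
# unit variances, correlation rho_a inside blocks, rho_b across blocks.
n, rho_a, rho_b = 4, 0.8, 0.2
Sigma = np.full((n, n), rho_b)
Sigma[:2, :2] = rho_a
Sigma[2:, 2:] = rho_a
np.fill_diagonal(Sigma, 1.0)

D = np.full(n, 0.1)  # identical per-component distortion constraints

# Correlation strictly lowers the bound relative to an independent
# source with the same marginals (Hadamard: det(Sigma) <= prod var_i).
print(hadamard_lower_bound(Sigma, D))
print(hadamard_lower_bound(np.eye(n), D))
```

Comparing the two printed values shows the compression gain available from correlation: the bound for the correlated source is strictly below the rate needed for independent components with the same variances and distortions.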