Dispersion of Gaussian Sources with Memory and an Extension to Abstract Sources

📅 2026-02-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the minimum achievable compression rate for finite blocklengths under a distortion constraint and a bounded excess-distortion probability, focusing on Gaussian sources with memory and general non-i.i.d. sources. By extending existing dispersion theory—previously limited to i.i.d. sources—to sources with memory, and by constructing typical sets via a point-mass product proxy measure, the authors derive a second-order asymptotic expansion of the minimum achievable rate: $R(n,d,\epsilon) = \mathbb{R}_n(d) + \sqrt{\mathbb{V}_n(d)/n}\,Q^{-1}(\epsilon) + O(\log n / n)$. This result leverages second-order asymptotic analysis, an expansion of the inverse Q-function, and typicality arguments for sums of non-i.i.d. random variables. The framework not only improves known performance bounds for scalar Gauss-Markov sources but also provides a unified characterization of rate-distortion trade-offs for general sources with memory.

📝 Abstract
We consider finite blocklength lossy compression of information sources whose components are independent but non-identically distributed. Crucially, Gaussian sources with memory and quadratic distortion can be cast in this form. We show that under the operational constraint of exceeding distortion $d$ with probability at most $\epsilon$, the minimum achievable rate at blocklength $n$ satisfies $R(n, d, \epsilon)=\mathbb{R}_n(d)+\sqrt{\frac{\mathbb{V}_n(d)}{n}}Q^{-1}(\epsilon)+O \left(\frac{\log n}{n}\right)$, where $Q^{-1}(\cdot)$ is the inverse $Q$-function, while $\mathbb{R}_n(d)$ and $\mathbb{V}_n(d)$ are fundamental characteristics of the source computed using its $n$-letter joint distribution and the distortion measure, called the $n$th-order informational rate-distortion function and the source dispersion, respectively. Our result generalizes the existing dispersion result for abstract sources with i.i.d. components. It also sharpens and extends the only known dispersion result for a source with memory, namely, the scalar Gauss-Markov source. The key novel technical tool in our analysis is the point-mass product proxy measure, which enables the construction of typical sets. This proxy generalizes the empirical distribution beyond the i.i.d. setting by preserving additivity across coordinates and facilitating a typicality analysis for sums of independent, non-identical terms.
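The second-order expansion in the abstract can be made concrete in the simplest special case: an i.i.d. Gaussian source with variance $\sigma^2$ under quadratic distortion, where the closed forms $\mathbb{R}_n(d) = \frac{1}{2}\ln(\sigma^2/d)$ nats and $\mathbb{V}_n(d) = \frac{1}{2}$ (for $0 < d < \sigma^2$) are known from prior dispersion results. A minimal sketch of the approximation, dropping the $O(\log n / n)$ remainder (function names are illustrative, not from the paper):

```python
import math
from statistics import NormalDist

def q_inv(eps):
    # Inverse Q-function: Q(x) = P(Z > x) for standard normal Z,
    # so Q^{-1}(eps) = Phi^{-1}(1 - eps).
    return NormalDist().inv_cdf(1.0 - eps)

def approx_rate(var, d, eps, n):
    """Second-order approximation
        R(n, d, eps) ~ R(d) + sqrt(V(d) / n) * Q^{-1}(eps),
    specialized to an i.i.d. Gaussian source with variance `var` and
    quadratic distortion, where R(d) = 0.5*ln(var/d) nats/sample and
    V(d) = 1/2 nats^2 for 0 < d < var. The O(log n / n) term is dropped.
    """
    if not (0 < d < var):
        raise ValueError("closed forms require 0 < d < var")
    r = 0.5 * math.log(var / d)  # rate-distortion function, nats/sample
    v = 0.5                      # source dispersion, nats^2
    return r + math.sqrt(v / n) * q_inv(eps)
```

For example, with `var=1.0`, `d=0.25`, `eps=0.01` the asymptotic rate is $\frac{1}{2}\ln 4 \approx 0.693$ nats/sample, and the finite-blocklength backoff $\sqrt{V(d)/n}\,Q^{-1}(\epsilon)$ decays like $1/\sqrt{n}$.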
Problem

Research questions and friction points this paper is trying to address.

Gaussian sources with memory
finite blocklength
lossy compression
source dispersion
non-i.i.d. sources
Innovation

Methods, ideas, or system contributions that make the work stand out.

source dispersion
non-i.i.d. sources
point-mass product proxy measure
finite blocklength compression
Gaussian sources with memory