Data Compression with Stochastic Codes

📅 2026-02-07
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Traditional lossy compression is constrained by the rigid structure of quantization and entropy coding, making it difficult to balance efficiency and flexibility. This paper surveys relative entropy coding, a framework that replaces the conventional quantization and entropy coding modules with stochastic codes to obtain a simpler yet more flexible paradigm for lossy compression. Drawing on both information theory and machine learning, it develops the theoretical foundations and practical algorithms of relative entropy coding, with particular attention to the computational aspects that bridge theoretical performance and real-world applicability. The paper also reviews applications of these methods to image and signal compression.
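To make the idea of a stochastic code concrete, below is a minimal sketch of one well-known relative entropy coding scheme, minimal random coding (importance-sampling based). It is an illustration only, not necessarily one of the paper's exact algorithms: the Gaussian target/prior, function names, and parameters are all assumptions chosen for simplicity. Encoder and decoder share a random seed; the encoder draws candidates from the prior, picks one with probability proportional to its importance weight, and transmits only the index, which the decoder uses to regenerate the same sample.

```python
import numpy as np

def encode(mu_q, sigma_q, seed, n_samples):
    """Sender: draw shared candidates from the prior p = N(0, 1),
    then sample an index with probability proportional to the
    importance weights q(z) / p(z), where q = N(mu_q, sigma_q^2)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)  # candidates from the prior
    # log q(z) - log p(z); the shared 0.5*log(2*pi) terms cancel
    log_w = (-0.5 * ((z - mu_q) / sigma_q) ** 2 - np.log(sigma_q)) + 0.5 * z ** 2
    w = np.exp(log_w - log_w.max())     # stabilise before normalising
    idx = rng.choice(n_samples, p=w / w.sum())
    return int(idx)                     # index costs ~log2(n_samples) bits

def decode(idx, seed, n_samples):
    """Receiver: regenerate the same candidates from the shared seed
    and return the one the sender selected."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_samples)
    return z[idx]
```

In theory, roughly 2^KL(q || p) candidates suffice for the selected sample to be approximately distributed as q, which is why the bit cost of the index is tied to the relative entropy (KL divergence) between target and prior.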


📝 Abstract
Machine learning has had a major impact on data compression over the last decade and inspired many new, exciting theoretical and applied questions. This paper describes one such direction -- relative entropy coding -- which focuses on constructing stochastic codes, primarily as an alternative to quantisation and entropy coding in lossy source coding. Our primary aim is to provide a broad overview of the topic, with an emphasis on the computational and practical aspects currently missing from the literature. Our goal is threefold: for the curious reader, we aim to provide an intuitive picture of the field and convince them that relative entropy coding is a simple yet exciting emerging field in data compression research. For a reader interested in applied research on lossy data compression, we provide an account of the most salient contemporary applications. Finally, for the reader who has heard of relative entropy coding but has never been quite sure what it is or how the algorithms fit together, we hope to illustrate how simple and elegant the underlying constructions are.
Problem

Research questions and friction points this paper is trying to address.

data compression
stochastic codes
relative entropy coding
lossy source coding
Innovation

Methods, ideas, or system contributions that make the work stand out.

relative entropy coding
stochastic codes
lossy compression
data compression
machine learning
🔎 Similar Papers
No similar papers found.