🤖 AI Summary
This work addresses the joint design of lossless compression and privacy for binary i.i.d. sources, under the requirement that decoding any single bit must reveal no information about the remaining bits—termed *perfect-privacy random access*. We propose an information-theoretic random coding framework that constructs conditionally independent codebooks to simultaneously guarantee lossless reconstruction and strong privacy. We prove that, for any compression rate strictly above the source entropy rate, both lossless compression and perfect-privacy random access are achievable—establishing the entropy rate as the fundamental limit (i.e., the minimum achievable rate under the privacy constraint). This is the first rigorous information-theoretic characterization demonstrating that perfect-privacy random access is feasible at rates arbitrarily close to, yet above, the entropy rate, thereby achieving optimal trade-offs between compression efficiency and formal privacy guarantees.
📝 Abstract
It is shown that an i.i.d. binary source sequence $X_1, \ldots, X_n$ can be losslessly compressed at any rate above entropy such that the individual decoding of any $X_i$ reveals \emph{no} information about the other bits $\{X_j : j \neq i\}$.
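The entropy rate referenced above is the binary entropy function $H(p)$, which for a Bernoulli($p$) source gives the minimum achievable compression rate in bits per symbol. As an illustrative sketch (the function name and script are our own, not from the paper), the following computes $H(p)$ for a few source biases; the result states that any rate strictly above these values admits perfect-privacy random access:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p), in bits, of a Bernoulli(p) source.

    H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
    """
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Any rate R > H(p) is achievable; no rate below H(p) is,
# even without the privacy constraint.
for p in (0.11, 0.25, 0.5):
    print(f"p = {p}: entropy rate H(p) = {binary_entropy(p):.4f} bits/symbol")
```

For an unbiased source ($p = 0.5$) the limit is 1 bit per symbol, so no compression is possible, while biased sources can be compressed below 1 bit per symbol while still supporting the privacy guarantee.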