List-Recovery of Random Linear Codes over Small Fields

📅 2025-05-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates list recovery of random linear codes over small finite fields, both from erasures and from adversarial errors, in the regime where the rate is $\varepsilon$-close to capacity. Specifically, it characterizes the dependence of the output list size $L$ on the gap to capacity $\varepsilon$, the input list size $\ell$, and the alphabet size $q$. Using probabilistic and coding-theoretic methods (random linear code ensembles and combinatorial bounds), the paper improves upon the classical Zyablov–Pinsker bound $L = q^{O(\ell/\varepsilon)}$ for the first time. For erasures over prime fields and errors over arbitrary finite fields, it establishes $L = O(1/\varepsilon)$, with explicit upper bounds $L \le C_1/\varepsilon$ (erasures) and $L \le C_2/\varepsilon$ (errors). These bounds strictly improve upon prior results whenever $q \le 2^{(1/\varepsilon)^c}$, and the constants $C_1, C_2$ are estimated quantitatively. The results show that linear codes achieve the optimal asymptotic dependence on $\varepsilon$ in this critical parameter regime.

📝 Abstract
We study list-recoverability of random linear codes over small fields, both from errors and from erasures. We consider codes of rate $\epsilon$-close to capacity, and aim to bound the dependence of the output list size $L$ on $\epsilon$, the input list size $\ell$, and the alphabet size $q$. Prior to our work, the best upper bound was $L = q^{O(\ell/\epsilon)}$ (Zyablov and Pinsker, Prob. Per. Inf. 1981). Previous work has identified cases in which linear codes provably perform worse than non-linear codes with respect to list-recovery. While there exist non-linear codes that achieve $L = O(\ell/\epsilon)$, we know that $L \ge \ell^{\Omega(1/\epsilon)}$ is necessary for list recovery from erasures over fields of small characteristic, and for list recovery from errors over large alphabets. We show that in other relevant regimes there is no significant price to pay for linearity, in the sense that we get the correct dependence on the gap-to-capacity $\epsilon$ and go beyond the Zyablov-Pinsker bound for the first time. Specifically, when $q$ is constant and $\epsilon$ approaches zero:

- For list-recovery from erasures over prime fields, we show that $L \leq C_1/\epsilon$. By prior work, such a result cannot be obtained for low-characteristic fields.
- For list-recovery from errors over arbitrary fields, we prove that $L \leq C_2/\epsilon$.

Above, $C_1$ and $C_2$ depend on the decoding radius, input list size, and field size. We provide concrete bounds on these constants, and the upper bounds on $L$ improve upon the Zyablov-Pinsker bound whenever $q \leq 2^{(1/\epsilon)^c}$ for some small universal constant $c > 0$.
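To see the scale of the improvement, the following sketch compares the logarithm of the old bound $q^{O(\ell/\epsilon)}$ with that of the new bound $C/\epsilon$. The hidden constants (`c_zp` and `C`) are not specified in the abstract, so they are set to 1 purely for illustration; this is a toy comparison, not the paper's actual quantitative estimates.

```python
import math

def log2_zyablov_pinsker(q, ell, eps, c_zp=1.0):
    # log2 of L = q^{O(ell/eps)}; hidden constant assumed to be c_zp (hypothetical)
    return c_zp * (ell / eps) * math.log2(q)

def log2_new_bound(eps, C=1.0):
    # log2 of L <= C/eps; C depends on the decoding radius, ell, and q,
    # but is assumed to be 1 here for illustration
    return math.log2(C / eps)

# Example: q = 2, ell = 2, eps = 0.01
# Old bound: log2(L) ~ 200 (L astronomically large); new bound: log2(L) ~ 6.6
zp = log2_zyablov_pinsker(2, 2, 0.01)
new = log2_new_bound(0.01)
assert new < zp
```

Even for a binary alphabet and tiny input lists, the old bound is exponential in $1/\epsilon$ while the new bound is linear, which is why the improvement holds for all $q \leq 2^{(1/\epsilon)^c}$.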
Problem

Research questions and friction points this paper is trying to address.

- Study list-recovery of random linear codes over small fields
- Bound the dependence of the list size $L$ on $\epsilon$, $\ell$, and the field size $q$
- Compare linear vs. non-linear codes in list-recovery performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

- List-recovery of random linear codes over small fields
- Improved upper bounds on the list size $L$
- Optimal dependence on the gap-to-capacity $\epsilon$