🤖 AI Summary
This work investigates the list-recoverability capacity of linear codes, specifically whether random linear codes and randomly punctured Reed–Solomon codes achieve the optimal list-recovery rate. The paper establishes, for the first time, that random linear codes are $(1-R-\varepsilon,\ell,L)$-list-recoverable with constant list size $L = (\ell/\varepsilon)^{O(\ell/\varepsilon)}$, thereby approaching the information-theoretic capacity bound. Concurrently, it proves a tight lower bound $L \ge \ell^{\Omega(1/\varepsilon)}$, exposing an inherent limitation imposed by linearity and nearly matching the upper bound. Technically, the analysis integrates the Zyablov–Pinsker argument, subspace intersection bounds due to Kopparty et al., and a Chen–Zhang-style lower-bound technique. These results provide both theoretical guarantees and structural insights for designing efficient, fault-tolerant coding schemes.
📝 Abstract
We prove several results on linear codes achieving list-recovery capacity. We show that random linear codes achieve list-recovery capacity with constant output list size (independent of the alphabet size and length). That is, over alphabets of size at least $\ell^{\Omega(1/\varepsilon)}$, random linear codes of rate $R$ are $(1-R-\varepsilon, \ell, (\ell/\varepsilon)^{O(\ell/\varepsilon)})$-list-recoverable for all $R \in (0,1)$ and $\ell$. Together with a result of Levi, Mosheiff, and Shagrithaya, this implies that randomly punctured Reed-Solomon codes also achieve list-recovery capacity. We also prove that our output list size is near-optimal among all linear codes: all $(1-R-\varepsilon, \ell, L)$-list-recoverable linear codes must have $L \ge \ell^{\Omega(1/\varepsilon)}$. Our simple upper bound combines the Zyablov-Pinsker argument with recent bounds from Kopparty, Ron-Zewi, Saraf, Wootters, and Tamo on the maximum intersection of a "list-recovery ball" and a low-dimensional subspace with large distance. Our lower bound is inspired by a recent lower bound of Chen and Zhang.
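For readers unfamiliar with the $(\rho, \ell, L)$ notation used above, the standard definition of list recoverability (stated here for reference; the paper may use a slightly different but equivalent formulation) is:

```latex
% A code C ⊆ Σ^n is (ρ, ℓ, L)-list-recoverable if, for every collection of
% input lists S_1, …, S_n ⊆ Σ with |S_i| ≤ ℓ, at most L codewords agree with
% the lists on all but a ρ fraction of coordinates:
\[
  \bigl|\{\, c \in C : |\{\, i \in [n] : c_i \notin S_i \,\}| \le \rho n \,\}\bigr| \le L
  \quad \text{for all } S_1, \dots, S_n \subseteq \Sigma \text{ with } |S_i| \le \ell.
\]
```

Setting $\ell = 1$ recovers the usual notion of list decoding with radius $\rho$ and list size $L$; the capacity bound $\rho = 1 - R - \varepsilon$ mirrors the list-decoding capacity theorem.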