🤖 AI Summary
This paper addresses the construction of optimal secure coded distributed computing schemes over arbitrary fields, including finite fields, overcoming the prior limitation to characteristic-zero fields. Methodologically, it integrates algebraic geometry (AG) codes, tensor decomposition, multivariate polynomial coding, and Hadamard–Schur products. The contributions are threefold: (1) the first extension of optimal secure coding to all fields; (2) a novel connection between finite-field functions and tensor rank, yielding a universal coding framework parameterized by the recovery threshold; and (3) the construction of log-additive codes and of evaluation codes satisfying degree constraints on algebraic curves. For $m \times m$ matrix multiplication, the scheme achieves recovery threshold $2m^{\omega} - 1 + g$, where $\omega$ is the exponent of matrix multiplication and $g$ is the genus of the underlying AG code, which is optimal among coding schemes based on AG codes. Furthermore, it proves that certain evaluation codes possess the log-additive property, substantially broadening both the applicability domain and the efficiency frontier of secure distributed computation.
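The connection between finite-field functions and polynomial (tensor) representations rests on a classical fact: every function on a finite field is a polynomial. The toy sketch below (parameters made up for illustration, not taken from the paper) recovers the unique interpolating polynomial of an arbitrary value table over $\mathrm{GF}(5)$ via Lagrange interpolation.

```python
# Toy check of the classical fact behind contribution (2): every function
# GF(p) -> GF(p) equals a polynomial of degree < p. We pick an arbitrary
# value table over GF(5) and recover its Lagrange interpolant.
p = 5

def inv(a):
    # Modular inverse by Fermat's little theorem (p is prime).
    return pow(a, p - 2, p)

def polymul(a, b):
    # Multiply two coefficient lists (lowest degree first) mod p.
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % p
    return out

def evalpoly(coeffs, x):
    # Horner evaluation mod p.
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % p
    return acc

def interpolate(values):
    # Lagrange interpolant agreeing with values[x] for every x in GF(p).
    coeffs = [0] * p
    for i in range(p):
        basis, denom = [1], 1
        for j in range(p):
            if j != i:
                basis = polymul(basis, [(-j) % p, 1])  # times (x - j)
                denom = denom * (i - j) % p
        scale = values[i] * inv(denom) % p
        for d, c in enumerate(basis):
            coeffs[d] = (coeffs[d] + scale * c) % p
    return coeffs

values = [2, 0, 4, 4, 1]          # an arbitrary function GF(5) -> GF(5)
poly = interpolate(values)
assert all(evalpoly(poly, x) == values[x] for x in range(p))
```

The same idea extends to several variables, which is why an arbitrary finite-field function inherits a recovery threshold governed by the rank of its polynomial (tensor) representation.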
📝 Abstract
We construct optimal secure coded distributed schemes that extend the known optimal constructions over fields of characteristic 0 to all fields. A serendipitous result is that we can encode *all* functions over finite fields with a recovery threshold proportional to the complexity (tensor rank or multiplicative complexity); this is due to the well-known result that every function over a finite field can be represented as a multivariate polynomial (or a symmetric tensor). We show that a tensor of order $\ell$ (or a multivariate polynomial of degree $\ell$) can be computed in a faulty network of $N$ nodes within a factor of $\ell$ and an additive term depending on the genus of a code with $N$ rational points and distance covering the number of faulty servers; in particular, we present a coding scheme for general matrix multiplication of two $m \times m$ matrices with a recovery threshold of $2m^{\omega} - 1 + g$, where $\omega$ is the exponent of matrix multiplication, which is optimal for coding schemes using AG codes. Moreover, we give sufficient conditions under which the Hadamard–Schur product of general linear codes gives a similar recovery threshold; we call such codes *log-additive codes*. Finally, we show that evaluation codes with a *curve degree* function (first defined in [Ben-Sasson et al. (STOC '13)]) that have well-behaved zero sets are log-additive.
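The Hadamard–Schur (componentwise) product mentioned above can be seen in miniature with Reed–Solomon codes, the genus-0 case of AG codes. A hedged sketch with toy parameters (not the paper's construction): products of codewords evaluate a polynomial whose degree is the *sum* of the factors' degrees, so the product codeword is determined by $k_1 + k_2 - 1$ coordinates, the additive behaviour behind recovery thresholds of the form $2R - 1$.

```python
# Toy Schur (Hadamard) product of two Reed-Solomon codes over GF(13).
# Componentwise products of codewords evaluate a polynomial of degree
# (k1-1)+(k2-1), so k1+k2-1 coordinates determine the whole product codeword.
p = 13
pts = list(range(1, 11))          # 10 distinct evaluation points in GF(13)

def inv(a):
    return pow(a, p - 2, p)

def evalpoly(coeffs, x):
    # Horner evaluation mod p.
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % p
    return acc

def rs_codeword(coeffs):
    # Evaluation (Reed-Solomon) encoding of a message polynomial.
    return [evalpoly(coeffs, x) for x in pts]

def lagrange_eval(xs, ys, x):
    # Evaluate at x the interpolant through the points (xs[i], ys[i]), mod p.
    total = 0
    for i, xi in enumerate(xs):
        num = den = 1
        for j, xj in enumerate(xs):
            if i != j:
                num = num * (x - xj) % p
                den = den * (xi - xj) % p
        total = (total + ys[i] * num * inv(den)) % p
    return total

k1, k2 = 3, 4
c1 = rs_codeword([2, 5, 1])        # message polynomial of degree < k1
c2 = rs_codeword([7, 0, 3, 11])    # message polynomial of degree < k2
star = [a * b % p for a, b in zip(c1, c2)]

# Any m = k1 + k2 - 1 coordinates of the product determine the rest,
# i.e. the remaining coordinates can be recovered as erasures.
m = k1 + k2 - 1
assert all(lagrange_eval(pts[:m], star[:m], pts[i]) == star[i]
           for i in range(m, len(pts)))
```

For AG codes on a curve of genus $g$, the same degree bookkeeping picks up the additive genus term, which is where the $+\,g$ in the recovery threshold comes from.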