Testing noisy low-degree polynomials for sparsity

📅 2025-11-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper studies property testing of low-degree polynomial sparsity under noise: given noisy evaluations of a degree-$d$ polynomial at random points, determine whether it is $s$-sparse or $\varepsilon$-far from every $T$-sparse low-degree polynomial. Methodologically, the work combines multilinear polynomial analysis with Fourier-tail estimation, extending the techniques of Dinur et al., originally developed for the Boolean hypercube, to arbitrary finitely supported distributions. The main contribution is the first exact characterization of when this problem admits constant sample complexity: when $T \geq \mathrm{MSG}$, the sample complexity is $O_{s,\boldsymbol{X},d}(1)$, independent of the dimension $n$; whereas when $T \leq \mathrm{MSG} - 1$, a lower bound of $\Omega(\log n)$ holds, revealing the monomial support gap ($\mathrm{MSG}$) as a sharp threshold. These results remove the dimension-dependent sample bottleneck in high-dimensional sparsity testing and generalize noise-tolerant sparse linearity testing to general low-degree polynomials.

📝 Abstract
We consider the problem of testing whether an unknown low-degree polynomial $p$ over $\mathbb{R}^n$ is sparse versus far from sparse, given access to noisy evaluations of the polynomial $p$ at *randomly chosen points*. This is a property-testing analogue of classical problems on learning sparse low-degree polynomials with noise, extending the work of Chen, De, and Servedio (2020) from noisy *linear* functions to general low-degree polynomials. Our main result gives a *precise characterization* of when sparsity testing for low-degree polynomials admits constant sample complexity independent of dimension, together with a matching constant-sample algorithm in that regime. For any mean-zero, variance-one finitely supported distribution $\boldsymbol{X}$ over the reals, degree $d$, and any sparsity parameters $s \leq T$, we define a computable function $\mathrm{MSG}_{\boldsymbol{X},d}(\cdot)$, and:

- For $T \geq \mathrm{MSG}_{\boldsymbol{X},d}(s)$, we give an $O_{s,\boldsymbol{X},d}(1)$-sample algorithm that distinguishes whether a multilinear degree-$d$ polynomial over $\mathbb{R}^n$ is $s$-sparse versus $\varepsilon$-far from $T$-sparse, given examples $(\boldsymbol{x},\, p(\boldsymbol{x}) + \mathrm{noise})_{\boldsymbol{x} \sim \boldsymbol{X}^{\otimes n}}$. Crucially, the sample complexity is *completely independent* of the ambient dimension $n$.
- For $T \leq \mathrm{MSG}_{\boldsymbol{X},d}(s) - 1$, we show that even without noise, any algorithm given samples $(\boldsymbol{x},\, p(\boldsymbol{x}))_{\boldsymbol{x} \sim \boldsymbol{X}^{\otimes n}}$ must use $\Omega_{\boldsymbol{X},d,s}(\log n)$ examples.

Our techniques employ a generalization of the results of Dinur et al. (2007) on the Fourier tails of bounded functions over $\{0,1\}^n$ to a broad range of finitely supported distributions, which may be of independent interest.
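To make the access model concrete, here is a minimal toy sketch (not the paper's algorithm) of the standard fact it builds on: when each coordinate of $\boldsymbol{X}$ is mean-zero and variance-one, the multilinear monomials $\prod_{i \in S} x_i$ are orthonormal under $\boldsymbol{X}^{\otimes n}$, so any single coefficient of $p$ can be estimated as an empirical average over noisy samples $(\boldsymbol{x},\, p(\boldsymbol{x}) + \mathrm{noise})$. The Rademacher distribution, the hidden polynomial, and the sample count below are illustrative assumptions.

```python
import random

def rademacher_vector(n):
    """Draw x uniformly from {-1, +1}^n (one mean-zero, variance-one choice)."""
    return [random.choice((-1, 1)) for _ in range(n)]

def estimate_coefficient(samples, subset):
    """Estimate the coefficient of prod_{i in subset} x_i from (x, y) pairs.

    By orthonormality of multilinear monomials under a mean-zero,
    variance-one product distribution, E[y * prod_{i in subset} x_i]
    equals the coefficient of that monomial in the hidden polynomial.
    """
    total = 0.0
    for x, y in samples:
        monomial = 1.0
        for i in subset:
            monomial *= x[i]
        total += y * monomial
    return total / len(samples)

if __name__ == "__main__":
    random.seed(0)
    n, num_samples = 8, 20000

    # Hypothetical hidden 2-sparse, degree-2 multilinear polynomial.
    def p(x):
        return 1.5 * x[0] * x[1] - 0.5 * x[3]

    samples = [(x, p(x) + random.gauss(0, 0.1))
               for x in (rademacher_vector(n) for _ in range(num_samples))]
    print(round(estimate_coefficient(samples, (0, 1)), 1))  # close to 1.5
    print(round(estimate_coefficient(samples, (2,)), 1))    # close to 0.0
```

A sparsity tester cannot afford to enumerate all $\binom{n}{\le d}$ monomials this way; the paper's point is precisely that in the $T \geq \mathrm{MSG}_{\boldsymbol{X},d}(s)$ regime a constant number of samples suffices, with no dependence on $n$.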
Problem

Research questions and friction points this paper is trying to address.

Testing sparsity of noisy low-degree polynomials
Characterizing constant sample complexity conditions
Distinguishing sparse versus far-from-sparse polynomials
Innovation

Methods, ideas, or system contributions that make the work stand out.

Testing sparsity of noisy polynomials via random samples
Using computable MSG function to determine sample complexity
Extending Fourier analysis techniques to finitely supported distributions