🤖 AI Summary
This work investigates property testing of halfspaces (i.e., linear threshold functions) over $\mathbb{R}^n$ under the standard Gaussian distribution, in the relative-error model designed for "sparse" functions satisfied by only a small fraction of inputs. Combining tools from Hermite analysis, Gaussian isoperimetric inequalities, and geometric results on noise sensitivity and surface area, the authors show for the first time that such testing can be achieved with sublinear query complexity. This circumvents the $\Omega(\log n)$ query lower bound known for the Boolean domain and uses substantially fewer queries than would be needed to learn the class, enabling efficient relative-error testing with low query overhead.
📝 Abstract
The relative-error property testing model was introduced in [CDHLNSY24] to facilitate the study of property testing for "sparse" Boolean-valued functions, i.e. ones for which only a small fraction of all input assignments satisfy the function. In this framework, the distance from the unknown target function $f$ that is being tested to a function $g$ is defined as $\mathrm{Vol}(f \mathop{\triangle} g)/\mathrm{Vol}(f)$, where the numerator is the fraction of inputs on which $f$ and $g$ disagree and the denominator is the fraction of inputs that satisfy $f$.
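The relative-error distance $\mathrm{Vol}(f \mathop{\triangle} g)/\mathrm{Vol}(f)$ can be illustrated with a small Monte Carlo sketch over $\{0,1\}^n$; the function names and sampling setup below are illustrative stand-ins, not anything specified in the paper:

```python
import random

def rel_error_distance(f, g, n, samples=200_000, seed=0):
    """Monte Carlo estimate of Vol(f △ g) / Vol(f) over the uniform
    distribution on {0,1}^n.

    f, g: Boolean-valued functions on n-bit tuples (hypothetical
    stand-ins for the tested function and a candidate halfspace).
    """
    rng = random.Random(seed)
    diff = sat = 0
    for _ in range(samples):
        x = tuple(rng.randint(0, 1) for _ in range(n))
        fx, gx = f(x), g(x)
        # Numerator: fraction of inputs where f and g disagree.
        diff += (fx != gx)
        # Denominator: fraction of inputs satisfying f.
        sat += fx
    if sat == 0:
        raise ValueError("f appears unsatisfiable on the sample")
    return diff / sat
```

Note that a sparse $f$ (tiny $\mathrm{Vol}(f)$) inflates the distance: disagreements are measured against $f$'s own satisfying mass, which is exactly what makes this model informative for sparse functions where the standard distance $\mathrm{Vol}(f \mathop{\triangle} g)$ would be uniformly small.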
Recent work [CDHNSY26] has shown that over the Boolean domain $\{0,1\}^n$, any relative-error testing algorithm for the fundamental class of halfspaces (i.e. linear threshold functions) must make $\Omega(\log n)$ oracle calls. In this paper we complement the [CDHNSY26] lower bound by showing that halfspaces can be relative-error tested over $\mathbb{R}^n$ under the standard $N(0,I_n)$ Gaussian distribution using a sublinear number of oracle calls -- in particular, substantially fewer than would be required for learning. Our results use a wide range of tools including Hermite analysis, Gaussian isoperimetric inequalities, and geometric results on noise sensitivity and surface area.
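In the Gaussian setting, the same relative-error distance is taken with $\mathrm{Vol}$ measured under $N(0, I_n)$. A minimal sketch, assuming halfspaces of the hypothetical form $x \mapsto \mathbf{1}[w \cdot x \ge \theta]$ (the estimator and parameter names are illustrative, not the paper's tester):

```python
import random

def gaussian_rel_distance(f, g, n, samples=200_000, seed=1):
    """Monte Carlo estimate of Vol(f △ g) / Vol(f), with Vol taken
    under the standard Gaussian N(0, I_n) rather than the uniform
    distribution on the Boolean cube."""
    rng = random.Random(seed)
    diff = sat = 0
    for _ in range(samples):
        x = [rng.gauss(0.0, 1.0) for _ in range(n)]
        fx, gx = f(x), g(x)
        diff += (fx != gx)  # mass of the symmetric difference
        sat += fx           # Gaussian volume of f
    if sat == 0:
        raise ValueError("f appears to have negligible Gaussian volume")
    return diff / sat

def halfspace(w, theta):
    """The linear threshold function x -> 1[w . x >= theta]."""
    return lambda x: sum(wi * xi for wi, xi in zip(w, x)) >= theta
```

For two origin-centered halfspaces whose normal vectors meet at angle $\alpha$, rotational symmetry of the Gaussian gives disagreement probability $\alpha/\pi$, so e.g. orthogonal normals yield $\mathrm{Vol}(f \mathop{\triangle} g) = 1/2$ and relative distance $1$ (since $\mathrm{Vol}(f) = 1/2$). An actual tester must of course estimate such quantities with far fewer oracle calls than naive sampling; this sketch only makes the distance being tested concrete.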