Learning and Testing Convex Functions

📅 2025-11-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work studies learning and testing of high-dimensional real-valued convex functions under the standard Gaussian measure, moving beyond prior convexity analyses confined to discrete domains (e.g., Hamming space). For Lipschitz convex functions, the paper establishes learnability in continuous high-dimensional Gaussian space: an agnostic proper learning algorithm achieves error ε with sample complexity n^{O(1/ε²)}, complemented by an n^{poly(1/ε)} lower bound in the correlational statistical query (CSQ) model. On the testing side, it gives a tolerant (two-sided) tester for convexity with the same sample complexity, obtained as a corollary of the learning result, and a one-sided tester (which never rejects convex functions) using O(√n/ε)^n samples. Compared to existing results in discrete settings, the framework operates under the natural Gaussian measure and provides matching learnability and testability guarantees, advancing the foundations of convex function learning and property testing in continuous, high-dimensional probabilistic spaces.

📝 Abstract
We consider the problems of \emph{learning} and \emph{testing} real-valued convex functions over Gaussian space. Despite the extensive study of function convexity across mathematics, statistics, and computer science, its learnability and testability have largely been examined only in discrete or restricted settings -- typically with respect to the Hamming distance, which is ill-suited for real-valued functions. In contrast, we study these problems in high dimensions under the standard Gaussian measure, assuming sample access to the function and a mild smoothness condition, namely Lipschitzness. A smoothness assumption is natural and, in fact, necessary even in one dimension: without it, convexity cannot be inferred from finitely many samples. As our main results, we give:

- Learning Convex Functions: An agnostic proper learning algorithm for Lipschitz convex functions that achieves error $\varepsilon$ using $n^{O(1/\varepsilon^2)}$ samples, together with a complementary lower bound of $n^{\mathrm{poly}(1/\varepsilon)}$ samples in the \emph{correlational statistical query (CSQ)} model.
- Testing Convex Functions: A tolerant (two-sided) tester for convexity of Lipschitz functions with the same sample complexity (as a corollary of our learning result), and a one-sided tester (which never rejects convex functions) using $O(\sqrt{n}/\varepsilon)^n$ samples.
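As a toy illustration of the abstract's point that sample access plus smoothness makes convexity checkable, the sketch below tests discrete convexity in one dimension: sort Gaussian-sampled points and verify that successive slopes are non-decreasing up to an additive slack. The function `looks_convex` and its `slack` tolerance are illustrative names chosen here; this is not the paper's tester, which works in high dimensions with quantitative tolerant guarantees.

```python
import random

def looks_convex(samples, slack=0.0):
    """Toy 1-D check: sort the sample points by x and verify that the
    slopes of consecutive chords are non-decreasing, up to an additive
    slack. An illustrative sketch, not the paper's algorithm."""
    pts = sorted(samples)
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in zip(pts, pts[1:])]
    return all(s2 >= s1 - slack for s1, s2 in zip(slopes, slopes[1:]))

# Sample x's from the standard Gaussian, as in the paper's setting.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(200)]
print(looks_convex([(x, x * x) for x in xs]))    # convex x^2 -> True
print(looks_convex([(x, -x * x) for x in xs]))   # concave -x^2 -> False
```

Without a Lipschitz (or similar smoothness) assumption, any finite sample admits both a convex and a far-from-convex completion, which is why the `slack` parameter alone cannot save a purely sample-based test.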
Problem

Research questions and friction points this paper is trying to address.

Learning convex functions under the Gaussian distribution with a smoothness (Lipschitz) constraint
Testing convexity of Lipschitz functions using sample-based methods
Establishing sample complexity bounds for learning and testing convex functions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Agnostic proper learning of Lipschitz convex functions under the Gaussian measure
A tolerant (two-sided) convexity tester with the same sample complexity as the learner
A one-sided tester (never rejecting convex functions) using exponentially many samples
🔎 Similar Papers
No similar papers found.