Universal, sample-optimal algorithms for recovery of anisotropic functions from i.i.d. samples

📅 2026-04-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses sample-efficient recovery of high-dimensional anisotropic functions whose smoothness structure is unknown a priori, recasting function recovery as a sparse recovery problem for Fourier coefficients. Building on compressed sensing, the authors construct a nonadaptive, universal algorithm that applies across a broad range of anisotropic Sobolev and dominating mixed smoothness Sobolev classes. A lower bound on the adaptive $m$-width shows this algorithm is optimal up to a dimension-independent polylogarithmic factor. The analysis further demonstrates that universal linear algorithms are necessarily suboptimal by a dimension-dependent polylogarithmic factor, establishing the essential role of nonlinear methods in achieving near-optimal universal recovery.
📝 Abstract
A key problem in approximation theory is the recovery of high-dimensional functions from samples. In many cases, the functions of interest exhibit anisotropic smoothness, and, in many practical settings, the nature of this anisotropy may be unknown a priori. Therefore, an important question involves the development of universal algorithms, namely, algorithms that simultaneously achieve optimal or near-optimal rates of convergence across a range of different anisotropic smoothness classes. In this work, we consider universal approximation of periodic functions that belong to anisotropic Sobolev spaces and anisotropic dominating mixed smoothness Sobolev spaces. Our first result is the construction of a universal algorithm. This recasts function recovery as a sparse recovery problem for Fourier coefficients and then exploits compressed sensing to yield the desired approximation rates. Note that this algorithm is nonadaptive, as it does not seek to learn the anisotropic smoothness of the target function. We then demonstrate optimality of this algorithm up to a dimension-independent polylogarithmic factor. We do this by presenting a lower bound for the adaptive $m$-width for the unit balls of such function classes. Finally, we demonstrate the necessity of nonlinear algorithms. We show that universal linear algorithms can achieve rates that are at best suboptimal by a dimension-dependent polylogarithmic factor. In other words, they suffer from a curse of dimensionality in the rate -- a phenomenon which justifies the necessity of nonlinear algorithms for universal recovery.
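The pipeline the abstract describes, sampling the function at points and then solving a sparse recovery problem for its Fourier coefficients, can be illustrated with a one-dimensional toy. The sketch below is an assumption-laden stand-in, not the paper's algorithm: it uses uniformly random sample points, a fixed candidate frequency grid, and orthogonal matching pursuit as the sparse solver, none of which are specified by the abstract.

```python
import numpy as np

def recover_sparse_fourier(samples, points, freqs, sparsity):
    """Recover a sparse vector of Fourier coefficients from point samples.

    Toy illustration of the compressed-sensing step: the measurement
    matrix has entries A[j, k] = exp(2*pi*i * freqs[k] * points[j]),
    and a sparse solution is found by orthogonal matching pursuit.
    """
    A = np.exp(2j * np.pi * np.outer(points, freqs))
    residual = samples.astype(complex).copy()
    support = []
    for _ in range(sparsity):
        # Greedily pick the frequency most correlated with the residual.
        correlations = np.abs(A.conj().T @ residual)
        correlations[support] = 0.0  # do not re-select chosen atoms
        support.append(int(np.argmax(correlations)))
        # Least-squares fit on the current support, then update residual.
        coef, *_ = np.linalg.lstsq(A[:, support], samples, rcond=None)
        residual = samples - A[:, support] @ coef
    x = np.zeros(len(freqs), dtype=complex)
    x[support] = coef
    return x

# Demo: a periodic function with 3 active frequencies out of 101 candidates,
# recovered from only 40 random samples (fewer samples than candidates).
rng = np.random.default_rng(0)
freqs = np.arange(-50, 51)
x_true = np.zeros(len(freqs), dtype=complex)
x_true[[20, 55, 90]] = [1.5, -2.0 + 1.0j, 0.7]
points = rng.uniform(0.0, 1.0, 40)
samples = np.exp(2j * np.pi * np.outer(points, freqs)) @ x_true
x_hat = recover_sparse_fourier(samples, points, freqs, sparsity=3)
max_err = np.max(np.abs(x_hat - x_true))
```

Note the compressed-sensing flavor: 40 samples suffice for 101 candidate frequencies because the target is 3-sparse, whereas a linear (least-squares) reconstruction would be underdetermined.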
Problem

Research questions and friction points this paper is trying to address.

anisotropic functions
function recovery
universal algorithms
sample complexity
high-dimensional approximation
Innovation

Methods, ideas, or system contributions that make the work stand out.

universal algorithm
anisotropic smoothness
compressed sensing
nonlinear approximation
sample complexity