Optimal Approximation of Zonoids and Uniform Approximation by Shallow Neural Networks

📅 2023-07-28
🏛️ arXiv.org
📈 Citations: 6
Influential: 3
🤖 AI Summary
This paper addresses two closely related approximation problems: (1) the optimal approximation rate of zonoids in $\mathbb{R}^{d+1}$ by $n$ line segments under the Hausdorff distance; and (2) the optimal uniform-norm approximation order of shallow ReLU$^k$ neural networks on their variation spaces. Methodologically, the work integrates geometric analysis, convex body approximation theory, and variational space modeling of ReLU$^k$ networks. For problem (1), it completely closes the long-standing logarithmic gap between upper and lower bounds for $d = 2,3$, and establishes tight approximation rates in all dimensions. For problem (2), it significantly improves the known approximation orders for all $k \geq 1$ and, crucially, achieves simultaneous uniform approximation of both the target function and its derivatives up to order $k$. The results constitute the strongest currently known uniform approximation guarantees for shallow ReLU$^k$ networks.
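For context, a width-$n$ shallow ReLU$^k$ network takes the following standard form from the approximation-theory literature (the symbols $a_i$, $w_i$, $b_i$ below are conventional notation, not taken from this page):

```latex
f_n(x) \;=\; \sum_{i=1}^{n} a_i \, \sigma_k\!\big(\langle w_i, x \rangle + b_i\big),
\qquad \sigma_k(t) \;=\; \max(0,t)^k,
```

and the associated variation space consists, roughly, of functions expressible as integrals of such ridge functions against a signed measure of finite total variation, with that minimal total variation serving as the norm.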
📝 Abstract
We study the following two related problems. The first is to determine to what error an arbitrary zonoid in $\mathbb{R}^{d+1}$ can be approximated in the Hausdorff distance by a sum of $n$ line segments. The second is to determine optimal approximation rates in the uniform norm for shallow ReLU$^k$ neural networks on their variation spaces. The first of these problems has been solved for $d \neq 2,3$, but when $d = 2,3$ a logarithmic gap between the best upper and lower bounds remains. We close this gap, which completes the solution in all dimensions. For the second problem, our techniques significantly improve upon existing approximation rates when $k \geq 1$, and enable uniform approximation of both the target function and its derivatives.
Problem

Research questions and friction points this paper is trying to address.

Determine error bounds for approximating zonoids with line segments
Improve approximation rates for shallow ReLU^k neural networks
Enable uniform approximation of functions and their derivatives
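The zonoid question above can be made precise with two standard convex-geometry definitions (background, not taken from this page): a sum of $n$ line segments is a zonotope, and a zonoid is any Hausdorff limit of zonotopes,

```latex
Z_n \;=\; \sum_{i=1}^{n} [0, v_i]
\;=\; \Big\{ \textstyle\sum_{i=1}^{n} t_i v_i \;:\; t_i \in [0,1] \Big\},
\qquad v_i \in \mathbb{R}^{d+1},
```

where the approximation error between convex bodies $K$ and $L$ is measured in the Hausdorff distance $d_H(K,L) = \max\big\{ \sup_{x \in K} \operatorname{dist}(x, L),\; \sup_{y \in L} \operatorname{dist}(y, K) \big\}$.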
Innovation

Methods, ideas, or system contributions that make the work stand out.

Close logarithmic gap in zonoid approximation
Improve shallow ReLU network approximation rates
Enable uniform approximation of derivatives