Higher Order Approximation Rates for ReLU CNNs in Korobov Spaces

📅 2025-01-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper investigates the $L_p$ approximation capability of ReLU convolutional neural networks (CNNs) for $m$-th order mixed-smooth functions in Korobov spaces. To address the curse of dimensionality and the classical bottleneck of second-order convergence rates in high-dimensional function approximation, we propose a CNN construction based on high-order sparse grid bases. We establish, for the first time, the exact representation of high-order sparse grid functions by ReLU CNNs. By leveraging mixed derivative regularity characterizations and refined $L_p$ error analysis, we prove that under an $(m+1)$-th order mixed smoothness assumption, deep CNNs achieve an approximation rate of $O(L^{-(m+1)} \log^{\alpha} L)$, where $L$ denotes the number of parameters, thereby surpassing the classical second-order limit. This result demonstrates that network depth is a critical mechanism enabling high-order approximation and provides theoretical justification for how deep learning mitigates the curse of dimensionality.
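The building block behind sparse grid constructions of this kind is the piecewise-linear hat function, which a ReLU network can represent exactly with three neurons. A minimal sketch of that classical identity (the specific coefficients below are the standard ones for the first-order hat on $[0,1]$, not taken from this paper's high-order construction):

```python
import numpy as np

def relu(x):
    """ReLU activation."""
    return np.maximum(x, 0.0)

def hat(x):
    # First-order hat function on [0, 1]: rises linearly from 0 at x=0
    # to 1 at x=1/2, then falls back to 0 at x=1, and is 0 outside.
    # Written exactly as a linear combination of three ReLU units.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

xs = np.linspace(0.0, 1.0, 5)
print(hat(xs))  # peaks at 1.0 for x = 0.5, vanishes at the endpoints
```

Higher-order sparse grid bases replace this first-order hat with smoother piecewise-polynomial functions; the paper's contribution is showing those can likewise be captured by (deeper) ReLU CNNs, which is what lifts the rate past second order.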

📝 Abstract
This paper investigates the $L_p$ approximation error for higher order Korobov functions using deep convolutional neural networks (CNNs) with ReLU activation. For target functions having a mixed derivative of order $m+1$ in each direction, we improve the classical second-order approximation rate to $(m+1)$-th order (modulo a logarithmic factor) in terms of the depth of the CNNs. The key ingredient in our analysis is the approximate representation of high-order sparse grid basis functions by CNNs. The results suggest that the higher order expressivity of CNNs does not severely suffer from the curse of dimensionality.
Problem

Research questions and friction points this paper is trying to address.

ReLU Convolutional Neural Networks
Korobov Functions
High-dimensional Approximation
Innovation

Methods, ideas, or system contributions that make the work stand out.

ReLU Convolutional Neural Networks
Korobov Space Function Approximation
Sparse Grid Basis Function Representation