High-dimensional classification problems with Barron regular boundaries under margin conditions

📅 2024-12-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
ReLU networks can suffer from poor approximation rates and slow learning on high-dimensional, discontinuous classification tasks. Method: This work investigates the approximability and learnability of classifiers with Barron-regular decision boundaries under margin conditions, combining Barron function space theory, expressive-power analysis of ReLU networks, and empirical risk minimization. Contribution/Results: Under a strong margin assumption, ReLU networks with three hidden layers are shown to approximate high-dimensional discontinuous classifiers at rates of high polynomial degree, breaking the curse of dimensionality; these expression-rate bounds in turn imply fast learning rates close to $O(n^{-1})$ in the number of samples $n$. Experiments on high-dimensional binary classification tasks, including 784-dimensional MNIST images, support the theoretical findings.
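For orientation, a "strong margin condition" of the kind referenced above is typically a statement that the data distribution places no mass near the decision boundary. A hedged sketch (the paper's exact definition may differ in its constants and formulation):

```latex
% Sketch of a strong margin condition: the data distribution \mu assigns
% no mass to a \delta-neighborhood of the decision boundary \partial\Omega.
\[
  \mu\bigl(\{\, x : \operatorname{dist}(x, \partial\Omega) < \delta \,\}\bigr) = 0
  \qquad \text{for some margin } \delta > 0 .
\]
```

Intuitively, the network only needs to be accurate away from the boundary, where the target classifier is locally constant.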

📝 Abstract
We prove that a classifier with a Barron-regular decision boundary can be approximated with a rate of high polynomial degree by ReLU neural networks with three hidden layers when a margin condition is assumed. In particular, for strong margin conditions, high-dimensional discontinuous classifiers can be approximated with a rate that is typically only achievable when approximating a low-dimensional smooth function. We demonstrate how these expression rate bounds imply fast-rate learning bounds that are close to $n^{-1}$ where $n$ is the number of samples. In addition, we carry out comprehensive numerical experimentation on binary classification problems with various margins. We study three different dimensions, with the highest dimensional problem corresponding to images from the MNIST data set.
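To make the margin mechanism concrete, here is a minimal toy sketch (not the paper's construction): for a *linear* boundary, a two-neuron ReLU "ramp" already represents the discontinuous indicator exactly on all points at distance at least $\delta$ from the boundary, smoothing it only inside the margin strip. The paper's result extends this idea to Barron-regular boundaries using three hidden layers; all names below are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def ramp_classifier(x, w, b, delta):
    """Toy ReLU ramp for the linear decision boundary {x : w.x + b = 0}.

    Outputs exactly 0 or 1 for every point at distance >= delta from the
    boundary; only the delta-strip around the boundary is smoothed.
    (Illustrative only; the paper treats Barron-regular boundaries with
    three-hidden-layer networks.)
    """
    s = (x @ w + b) / delta          # signed distance, rescaled by the margin
    return relu(s) - relu(s - 1.0)   # clamps to [0, 1]; a 2-neuron ReLU layer

# Points at distance >= delta are classified exactly, despite the
# discontinuity of the underlying indicator function.
w = np.array([1.0, 0.0, 0.0])        # boundary is the hyperplane x_0 = 0
delta = 0.5
x = np.array([[ 1.0, 0.3, -0.2],     # distance 1.0 >= delta -> label 1
              [-1.0, 0.5,  0.4]])    # distance 1.0 >= delta -> label 0
print(ramp_classifier(x, w, 0.0, delta))   # -> [1. 0.]
```

The point of the margin assumption is visible here: the approximation error is confined to a strip that, by assumption, carries no probability mass.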
Problem

Research questions and friction points this paper is trying to address.

High-Dimensional Data Classification
ReLU Neural Networks
Learning Efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

ReLU Neural Networks
High-Dimensional Data Classification
Fast Learning Rates