Neural Network Approximation: A View from Polytope Decomposition

📅 2026-01-26
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the limitations of existing universal approximation theories for neural networks, which typically rely on uniform hypercube partitions and struggle to capture the local irregularities of target functions near singularities. To overcome this, the authors propose a task-oriented approximation strategy based on polyhedral decomposition, integrating kernel polynomial constructions with Totik–Ditzian-type moduli of continuity. Within each subdomain, ReLU networks are individually tailored to the local geometry and regularity of the function. This approach significantly enhances approximation efficiency and flexibility in regions containing singularities and achieves faster convergence rates for analytic functions compared to conventional uniform partitioning methods.

📝 Abstract
Universal approximation theory offers a foundational framework for verifying neural network expressiveness, enabling principled use in real-world applications. However, most existing theoretical constructions are established by uniformly dividing the input space into tiny hypercubes, without considering the local regularity of the target function. In this work, we investigate the universal approximation capabilities of ReLU networks from the viewpoint of polytope decomposition, which offers a more realistic and task-oriented approach than current methods. To this end, we develop an explicit kernel polynomial method that yields a universal approximation of continuous functions, characterized not only by the refined Totik–Ditzian-type modulus of continuity but also by polytopal domain decomposition. A ReLU network is then constructed to approximate the kernel polynomial in each subdomain separately. Furthermore, we find that polytope decomposition makes our approximation more efficient and flexible than existing methods in many cases, especially near singular points of the target function. Lastly, we extend our approach to analytic functions to attain a higher approximation rate.
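The benefit of regularity-aware partitioning can be illustrated with a minimal one-dimensional sketch. This is not the paper's construction; it uses piecewise-linear interpolation (which one-hidden-layer ReLU networks represent exactly in 1D) and a hypothetical graded mesh of our own choosing, clustered near the singularity of √x at the origin, compared against a uniform mesh with the same number of knots:

```python
import numpy as np

def pwl_interp(knots, f, x):
    # Piecewise-linear interpolant of f at the given knots, evaluated at x.
    # In 1D, such an interpolant is exactly representable by a
    # one-hidden-layer ReLU network with one neuron per knot.
    return np.interp(x, knots, f(knots))

f = lambda t: np.sqrt(t)              # derivative blows up at t = 0
x = np.linspace(0.0, 1.0, 20001)      # dense evaluation grid

n = 16
uniform = np.linspace(0.0, 1.0, n + 1)           # uniform partition
graded = np.linspace(0.0, 1.0, n + 1) ** 2       # knots clustered near 0

err_uniform = np.max(np.abs(f(x) - pwl_interp(uniform, f, x)))
err_graded = np.max(np.abs(f(x) - pwl_interp(graded, f, x)))

print(err_uniform, err_graded)
```

With the same budget of 17 knots, the graded mesh reduces the sup-norm error by roughly a factor of four (≈ 1/64 versus ≈ 1/16), mirroring the paper's point that subdomains adapted to local regularity outperform uniform hypercube partitions near singular points.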
Problem

Research questions and friction points this paper is trying to address.

universal approximation
ReLU networks
polytope decomposition
local regularity
singular points
Innovation

Methods, ideas, or system contributions that make the work stand out.

polytope decomposition
ReLU networks
universal approximation
kernel polynomial method
modulus of continuity