Dense Neural Networks are not Universal Approximators

📅 2026-02-07
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
This study investigates whether densely connected neural networks retain universal approximation capability under practical constraints on weight magnitude and network width. By combining insights from model compression, a weak regularity lemma, and the message-passing perspective on graph neural networks, we rigorously analyze the approximation limits of dense ReLU networks for Lipschitz continuous functions. We establish, for the first time, that under these natural constraints such dense networks cannot approximate all Lipschitz functions, thereby disproving their universal approximation property in this regime. Our findings further show that sparse connectivity is essential for universal approximation, offering a theoretical explanation for the expressive limitations of dense architectures under realistic resource constraints.
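
To make the constrained setting concrete, here is a minimal NumPy sketch of a dense ReLU network whose weight magnitudes are bounded, in the spirit of the "natural constraints" the summary describes. The class name `BoundedDenseReLU`, the bound `w_max`, and the clamped random initialization are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class BoundedDenseReLU:
    """A fully connected (dense) ReLU network with every weight
    clamped to [-w_max, w_max] -- an illustrative stand-in for the
    bounded-weight, bounded-width networks studied in the paper."""

    def __init__(self, layer_dims, w_max=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per layer transition; magnitudes clamped.
        self.weights = [
            np.clip(rng.standard_normal((m, n)), -w_max, w_max)
            for n, m in zip(layer_dims[:-1], layer_dims[1:])
        ]
        self.biases = [np.zeros(m) for m in layer_dims[1:]]

    def __call__(self, x):
        # Hidden layers use ReLU; the output layer is affine.
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            x = relu(W @ x + b)
        return self.weights[-1] @ x + self.biases[-1]

# Example: a width-64 network on inputs from [0, 1]^2. The paper's
# negative result says no choice of such bounded weights can
# approximate every 1-Lipschitz target uniformly well.
net = BoundedDenseReLU([2, 64, 64, 1], w_max=1.0)
print(net(np.array([0.3, 0.7])))
```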

📝 Abstract
We investigate the approximation capabilities of dense neural networks. While universal approximation theorems establish that sufficiently large architectures can approximate arbitrary continuous functions when there are no restrictions on the weight values, we show that dense neural networks do not possess this universality. Our argument is based on a model compression approach, combining the weak regularity lemma with an interpretation of feedforward networks as message passing graph neural networks. We consider ReLU neural networks subject to natural constraints on the weights and on the input and output dimensions, which together model a notion of dense connectivity. Within this setting, we demonstrate the existence of Lipschitz continuous functions that cannot be approximated by such networks. This highlights intrinsic limitations of neural networks with dense layers and motivates the use of sparse connectivity as a necessary ingredient for achieving true universality.
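
The abstract's central claim can be read as an explicit non-approximation statement. The LaTeX fragment below is an illustrative formalization; the class notation $\mathcal{N}_{W,B}$ and the exact quantifiers are our assumptions, not the paper's stated theorem.

```latex
% Illustrative formalization (notation assumed, not the paper's exact
% statement): let $\mathcal{N}_{W,B}$ denote the dense ReLU networks
% with width at most $W$ and weight magnitudes at most $B$. Then there
% exist $\varepsilon > 0$ and a $1$-Lipschitz function
% $f : [0,1]^d \to \mathbb{R}$ such that
\[
  \inf_{N \in \mathcal{N}_{W,B}} \; \sup_{x \in [0,1]^d}
    \bigl| f(x) - N(x) \bigr| \;\ge\; \varepsilon .
\]
% By contrast, the classical universal approximation theorem asserts
% that this infimum vanishes once the weight constraint is removed.
```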
Problem

Research questions and friction points this paper is trying to address.

dense neural networks
universal approximation
Lipschitz continuous functions
model compression
sparse connectivity
Innovation

Methods, ideas, or system contributions that make the work stand out.

dense neural networks
universal approximation
model compression
weak regularity lemma
sparse connectivity