The Computational Complexity of Counting Linear Regions in ReLU Neural Networks

📅 2025-05-22
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This paper systematically investigates the computational complexity of counting linear regions in ReLU neural networks. Because prior literature uses multiple non-equivalent definitions of a linear region, the authors first disambiguate their semantic distinctions and logical relationships, establishing a formal framework for classifying the complexity of the counting problem. They prove that counting linear regions is both NP-hard and #P-hard already for single-hidden-layer networks and, for networks with two or more hidden layers, admits no polynomial-factor approximation unless P = NP. On the positive side, they devise an exact counting algorithm that runs in polynomial space and applies to the mainstream definitions. The main contributions are: (i) resolving conceptual ambiguities surrounding linear-region definitions; (ii) characterizing the boundaries of computational hardness; and (iii) providing an exact, polynomial-space counting algorithm. These results lay a rigorous theoretical foundation for analyzing neural network expressivity and informing principled architectural design.

๐Ÿ“ Abstract
An established measure of the expressive power of a given ReLU neural network is the number of linear regions into which it partitions the input space. There exist many different, non-equivalent definitions of what a linear region actually is. We systematically assess which papers use which definitions and discuss how they relate to each other. We then analyze the computational complexity of counting the number of such regions for the various definitions. Generally, this turns out to be an intractable problem. We prove NP- and #P-hardness results already for networks with one hidden layer and strong hardness of approximation results for two or more hidden layers. Finally, on the algorithmic side, we demonstrate that counting linear regions can at least be achieved in polynomial space for some common definitions.
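The notion of linear regions in the abstract can be made concrete with a small sketch. For a one-hidden-layer ReLU network, each hidden neuron defines a hyperplane in input space, and under the common activation-pattern definition the linear regions correspond to cells of the resulting hyperplane arrangement. The brute-force grid sampler below is purely illustrative and is not the paper's algorithm; the weights, grid bounds, and sampling resolution are arbitrary choices for the demo.

```python
from itertools import product

# One hidden layer with 3 ReLU units on 2-D input.
# Unit i is active where W[i] . x + b[i] > 0, so every input point gets a
# binary activation pattern; under the activation-pattern definition,
# distinct patterns correspond to distinct linear regions.
W = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # hyperplane normals (demo values)
b = [0.0, 0.0, -1.0]                      # offsets: lines x=0, y=0, x+y=1

def activation_pattern(x, y):
    return tuple(int(w[0] * x + w[1] * y + c > 0.0) for w, c in zip(W, b))

def count_regions_by_sampling(lo=-2.0, hi=2.0, steps=17, offset=0.013):
    # Dense grid sampling yields a LOWER BOUND on the region count in
    # general; the small offset keeps sample points off the hyperplanes.
    pts = [lo + offset + (hi - lo) * i / (steps - 1) for i in range(steps)]
    return len({activation_pattern(x, y) for x, y in product(pts, pts)})

# Three lines in general position split the plane into
# 1 + n + n*(n-1)/2 = 7 cells, and this grid happens to hit all of them.
print(count_regions_by_sampling())  # 7
```

Such exhaustive sampling blows up quickly with input dimension and depth, which is exactly why the hardness and polynomial-space results summarized above matter.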
Problem

Research questions and friction points this paper is trying to address.

Analyzing computational complexity of counting ReLU network linear regions
Comparing non-equivalent definitions of linear regions in neural networks
Proving NP- and #P-hardness for counting regions in shallow networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Systematically assess definitions of linear regions
Analyze computational complexity of counting regions
Prove NP- and #P-hardness already for one-hidden-layer networks
Moritz Stargalla
University of Technology Nuremberg
Christoph Hertrich
University of Technology Nuremberg
Daniel Reichman
Worcester Polytechnic Institute