Parameterized Hardness of Zonotope Containment and Neural Network Verification

๐Ÿ“… 2025-09-26
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This paper investigates the parameterized computational complexity of fundamental properties of two-layer ReLU neural networks. It focuses on four problems: (i) positivity (and surjectivity) checking, (ii) zonotope containment, (iii) exact and approximate computation of the $L_p$-Lipschitz constant for $p \in (0,\infty]$, and (iv) Lipschitz estimation for three-layer networks. Using carefully constructed parameterized reductions, the authors establish, for the first time, W[1]-hardness of positivity and zonotope containment with respect to the input dimension $d$, thereby filling a key theoretical gap. They also prove tight W[1]-hardness or NP-hardness lower bounds for all four problems. These results reveal the intrinsic computational hardness of basic analytical tasks for high-dimensional ReLU networks and provide new complexity-theoretic foundations for neural network verification, geometric reasoning, and robust control.

๐Ÿ“ Abstract
Neural networks with ReLU activations are a widely used model in machine learning. It is thus important to have a profound understanding of the properties of the functions computed by such networks. Recently, there has been increasing interest in the (parameterized) computational complexity of determining these properties. In this work, we close several gaps and resolve an open problem posed by Froese et al. [COLT '25] regarding the parameterized complexity of various problems related to network verification. In particular, we prove that deciding positivity (and thus surjectivity) of a function $f\colon\mathbb{R}^d\to\mathbb{R}$ computed by a 2-layer ReLU network is W[1]-hard when parameterized by $d$. This result also implies that zonotope (non-)containment is W[1]-hard with respect to $d$, a problem that is of independent interest in computational geometry, control theory, and robotics. Moreover, we show that approximating the maximum within any multiplicative factor in 2-layer ReLU networks, computing the $L_p$-Lipschitz constant for $p\in(0,\infty]$ in 2-layer networks, and approximating the $L_p$-Lipschitz constant in 3-layer networks are NP-hard and W[1]-hard with respect to $d$. Notably, our hardness results are the strongest known so far and imply that the naive enumeration-based methods for solving these fundamental problems are all essentially optimal under the Exponential Time Hypothesis.
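The abstract's claim that naive enumeration-based methods are essentially optimal can be illustrated in the simplest setting. The sketch below (an illustrative assumption, not the paper's construction) decides positivity of a 2-layer ReLU network exactly when the input dimension is $d=1$: the function is piecewise linear, so it suffices to inspect the $O(n)$ breakpoints and the two asymptotic slopes. In dimension $d$, the analogous enumeration ranges over up to $n^{O(d)}$ linear regions, and the W[1]-hardness with respect to $d$ indicates that no $f(d)\cdot\mathrm{poly}(n)$ algorithm exists under standard assumptions.

```python
def relu_net(x, neurons, c=0.0):
    """Evaluate f(x) = c + sum_i a_i * max(0, w_i * x + b_i) for 1-D input.

    `neurons` is a list of (a, w, b) triples (hypothetical toy encoding)."""
    return c + sum(a * max(0.0, w * x + b) for (a, w, b) in neurons)

def is_positive_somewhere_1d(neurons, c=0.0):
    """Decide whether f(x) > 0 for some real x (exact, for d = 1 only).

    f is piecewise linear; between breakpoints it is linear, so its
    supremum over the reals is attained at a breakpoint unless f is
    unbounded above along one of the two infinite rays."""
    # Breakpoints: kinks of f at x = -b/w for every neuron with w != 0.
    breakpoints = [-b / w for (_, w, b) in neurons if w != 0]
    if any(relu_net(x, neurons, c) > 0 for x in breakpoints):
        return True
    # As x -> +inf, exactly the neurons with w > 0 are active,
    # so the slope there is sum of a*w over them; f is unbounded
    # above on that ray iff the slope is positive.
    slope_right = sum(a * w for (a, w, _) in neurons if w > 0)
    # As x -> -inf, exactly the neurons with w < 0 are active;
    # f is unbounded above going left iff that slope is negative.
    slope_left = sum(a * w for (a, w, _) in neurons if w < 0)
    if slope_right > 0 or slope_left < 0:
        return True
    # No breakpoints at all: f is the constant c plus fixed contributions.
    if not breakpoints:
        return relu_net(0.0, neurons, c) > 0
    return False
```

For example, `is_positive_somewhere_1d([(1.0, 1.0, -1.0)])` reports that $f(x)=\max(0,x-1)$ takes positive values, while $f(x)=-|x|$, encoded as `[(-1.0, 1.0, 0.0), (-1.0, -1.0, 0.0)]`, does not. The breakpoint list is the $d=1$ shadow of the exponential region enumeration that the hardness results show cannot be avoided in general.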
Problem

Research questions and friction points this paper is trying to address.

Proving W[1]-hardness for verifying 2-layer ReLU network properties
Establishing parameterized hardness of zonotope containment problems
Showing optimality of enumeration methods for neural network verification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proving W[1]-hardness for ReLU network positivity
Establishing parameterized hardness for zonotope containment
Showing optimality of enumeration methods via ETH
๐Ÿ”Ž Similar Papers
No similar papers found.
Vincent Froese
Technische Universitรคt Berlin
Moritz Grillo
Max Planck Institute for Mathematics in the Sciences
Christoph Hertrich
University of Technology Nuremberg
Moritz Stargalla
University of Technology Nuremberg