Exact Upper and Lower Bounds for the Output Distribution of Neural Networks with Random Inputs

📅 2025-02-17
🤖 AI Summary
This work derives tight upper and lower bounds on the cumulative distribution function (CDF) of neural network outputs under stochastic inputs. The proposed method introduces a general bounding framework based on ReLU surrogate models, extending distributional bounding for the first time to networks with convolutional layers and a range of activation functions, including ReLU, tanh, and softmax. The approach combines properties of piecewise monotonic activations, ReLU-network surrogate construction, and probabilistic bound propagation. Experiments demonstrate that the method yields theoretically guaranteed CDF bounds over the entire support, enabling verifiable upper bounds on prediction error; moreover, the bounds converge to the true CDF as the resolution increases, strengthening certifiable uncertainty quantification of neural network outputs.
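To make the problem setting concrete, the sketch below samples the output of a small ReLU network under Gaussian input noise and evaluates its empirical CDF. This is only an illustration of the quantity the paper bounds (the network, its weights, and the Monte Carlo estimate are hypothetical choices here, not the authors' guaranteed-bound construction):

```python
import random

def relu(x):
    return max(0.0, x)

def tiny_net(x1, x2):
    # Hypothetical 2-2-1 ReLU network with fixed illustrative weights.
    h1 = relu(0.8 * x1 - 0.5 * x2 + 0.1)
    h2 = relu(-0.3 * x1 + 0.9 * x2 - 0.2)
    return 1.2 * h1 - 0.7 * h2 + 0.05

def empirical_cdf(samples, t):
    # Fraction of sampled outputs not exceeding the threshold t.
    return sum(1 for y in samples if y <= t) / len(samples)

random.seed(0)
outputs = [tiny_net(random.gauss(0, 1), random.gauss(0, 1))
           for _ in range(10_000)]

# The empirical CDF is monotone and lies in [0, 1]; the paper's method
# replaces this sampling estimate with guaranteed upper/lower bounds
# that hold over the entire support.
grid = [-1.0, 0.0, 0.5, 1.0, 2.0]
cdf_vals = [empirical_cdf(outputs, t) for t in grid]
```

Unlike such a Monte Carlo estimate, the bounds described above come with exact guarantees rather than statistical error.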

📝 Abstract
We derive exact upper and lower bounds for the cumulative distribution function (cdf) of the output of a neural network over its entire support subject to noisy (stochastic) inputs. The upper and lower bounds converge to the true cdf over its domain as the resolution increases. Our method applies to any feedforward NN using continuous monotonic piecewise differentiable activation functions (e.g., ReLU, tanh and softmax) and convolutional NNs, which were beyond the scope of competing approaches. The novelty and an instrumental tool of our approach is to bound general NNs with ReLU NNs. The ReLU NN based bounds are then used to derive upper and lower bounds of the cdf of the NN output. Experiments demonstrate that our method delivers guaranteed bounds of the predictive output distribution over its support, thus providing exact error guarantees, in contrast to competing approaches.
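The abstract's instrumental idea is to sandwich a general monotonic activation between piecewise bounds that a ReLU network can represent. A minimal sketch of that flavor of argument, assuming only that the activation is nondecreasing (this grid-based step envelope is an illustration, not the authors' construction):

```python
import math

def step_bounds(f, lo, hi, n):
    """Piecewise-constant lower/upper envelopes of a nondecreasing f on [lo, hi].

    On each grid cell [a, b], monotonicity gives f(a) <= f(x) <= f(b), so
    evaluating f at the cell endpoints yields guaranteed bounds that
    tighten as the resolution n grows.
    """
    grid = [lo + (hi - lo) * i / n for i in range(n + 1)]
    cells = list(zip(grid[:-1], grid[1:]))
    lower = [f(a) for a, b in cells]
    upper = [f(b) for a, b in cells]
    return cells, lower, upper

# Envelope tanh on [-3, 3] with 60 cells of width 0.1.
cells, lower, upper = step_bounds(math.tanh, -3.0, 3.0, 60)

# Sandwich property at each cell midpoint.
for (a, b), lb, ub in zip(cells, lower, upper):
    m = 0.5 * (a + b)
    assert lb <= math.tanh(m) <= ub
```

Since |tanh'| <= 1, the gap per cell is at most the cell width, so the envelopes converge to tanh as n grows, mirroring the abstract's claim that the CDF bounds converge as the resolution increases.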
Problem

Research questions and friction points this paper is trying to address.

Deriving exact bounds on neural network outputs
Handling stochastic inputs with precise guarantees
Applicability across neural network architectures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Exact CDF bounds for neural network outputs
ReLU-NN-based bounding technique
Support for diverse activation functions
Andrey Kofnov
Institute of Statistics and Mathematical Methods in Economics, Faculty of Mathematics and Geoinformation, TU Wien, Vienna, Austria
Daniel Kapla
Institute of Statistics and Mathematical Methods in Economics, Faculty of Mathematics and Geoinformation, TU Wien, Vienna, Austria
Ezio Bartocci
Faculty of Informatics, TU Wien
Cyber-Physical Systems, Internet of Things, Runtime Verification, Cyber-security, Safe AI
Efstathia Bura
Institute of Statistics and Mathematical Methods in Economics, Faculty of Mathematics and Geoinformation, TU Wien, Vienna, Austria