Neural Network Verification with Branch-and-Bound for General Nonlinearities

📅 2024-05-31
🏛️ arXiv.org
📈 Citations: 6
Influential: 0
🤖 AI Summary
Existing branch-and-bound (BaB) methods are restricted to piecewise-linear networks (e.g., ReLU) and fail to scale to general nonlinearities—such as Sigmoid, Tanh, GeLU, or multiplicative operations—as well as complex computational graphs including LSTMs, Vision Transformers (ViTs), and AC optimal power flow (ACOPF) systems. This work introduces GenBaB, the first BaB framework for general nonlinear neural network verification. Its core contributions are: (1) a unified differentiable linear bound propagation scheme for nonlinear activations and multidimensional operations; (2) an efficient branching heuristic leveraging pre-optimized branching points and lookup-table acceleration; and (3) deep integration with the α,β-CROWN engine. GenBaB achieved first place in VNN-COMP 2023 and 2024, enabling the first scalable, high-precision verification of Sigmoid/Tanh/GeLU networks, multiplication-rich temporal and vision models, and ACOPF systems—significantly improving convergence speed and certified robustness rates.
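The branching heuristic mentioned above can be illustrated with a toy sketch. This is not the paper's actual scoring rule: the chord-gap proxy, the midpoint split, and the neuron names are all illustrative assumptions. The idea it demonstrates is using cheap linear-bound quantities to estimate which neuron benefits most from branching.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def chord_gap(l, u, samples=200):
    """Max distance between sigmoid and its endpoint chord on [l, u]:
    a cheap proxy for how loose a linear relaxation is."""
    slope = (sigmoid(u) - sigmoid(l)) / (u - l)
    return max(abs(sigmoid(x) - (sigmoid(l) + slope * (x - l)))
               for x in (l + i * (u - l) / samples for i in range(samples + 1)))

def branching_score(l, u):
    """Estimated improvement from branching this neuron at its midpoint."""
    mid = 0.5 * (l + u)
    return chord_gap(l, u) - max(chord_gap(l, mid), chord_gap(mid, u))

# Hypothetical intermediate bounds for three sigmoid neurons.
bounds = {"n0": (0.0, 4.0), "n1": (0.0, 1.0), "n2": (1.0, 3.0)}
best = max(bounds, key=lambda n: branching_score(*bounds[n]))
```

Under this proxy, the neuron with the widest, most curved interval (`n0`) scores highest, so it would be branched first.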

📝 Abstract
Branch-and-bound (BaB) is among the most effective techniques for neural network (NN) verification. However, existing works on BaB for NN verification have mostly focused on NNs with piecewise linear activations, especially ReLU networks. In this paper, we develop a general framework, named GenBaB, to conduct BaB on general nonlinearities to verify NNs with general architectures, based on linear bound propagation for NN verification. To decide which neuron to branch, we design a new branching heuristic which leverages linear bounds as shortcuts to efficiently estimate the potential improvement after branching. To decide nontrivial branching points for general nonlinear functions, we propose to pre-optimize branching points, which can be efficiently leveraged during verification with a lookup table. We demonstrate the effectiveness of our GenBaB on verifying a wide range of NNs, including NNs with activation functions such as Sigmoid, Tanh, Sine and GeLU, as well as NNs involving multi-dimensional nonlinear operations such as multiplications in LSTMs and Vision Transformers. Our framework also allows the verification of general nonlinear computation graphs and enables verification applications beyond simple NNs, particularly for AC Optimal Power Flow (ACOPF). GenBaB is part of the latest α,β-CROWN, the winner of the 4th and the 5th International Verification of Neural Networks Competition (VNN-COMP 2023 and 2024). Code for reproducing the experiments is available at https://github.com/shizhouxing/GenBaB.
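As a concrete illustration of why branching helps for a smooth activation (a minimal sketch, not the paper's relaxation): on [0, 4], where sigmoid is concave, the chord through its endpoints is a sound linear lower bound, and the chord's worst-case gap to the function measures how loose the relaxation is. Splitting the interval at a branching point tightens the relaxation on both pieces. The branching point 2.0 below is an arbitrary choice, exactly the kind of point the paper proposes to pre-optimize instead.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def chord_gap(l, u, samples=400):
    # On a concave region, the chord through (l, f(l)) and (u, f(u)) is a
    # sound linear lower bound; its worst-case gap measures looseness.
    slope = (sigmoid(u) - sigmoid(l)) / (u - l)
    return max(sigmoid(x) - (sigmoid(l) + slope * (x - l))
               for x in (l + i * (u - l) / samples for i in range(samples + 1)))

l, u = 0.0, 4.0      # sigmoid is concave on [0, inf)
whole = chord_gap(l, u)
p = 2.0              # an arbitrary (unoptimized) branching point
after = max(chord_gap(l, p), chord_gap(p, u))
assert after < whole  # branching tightens the relaxation on both pieces
```

For intervals that cross the inflection point, sound bounds and good split points are much less obvious, which is what motivates the pre-optimized branching points in the paper.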
Problem

Research questions and friction points this paper is trying to address.

Existing BaB methods target piecewise-linear activations (chiefly ReLU) and do not extend to general nonlinearities such as Sigmoid, Tanh, Sine, or GeLU
Branching on general nonlinear functions needs both a neuron-selection heuristic and nontrivial branching points, neither of which prior work provides
Complex nonlinear computation graphs (e.g., LSTMs, Vision Transformers, ACOPF) remain beyond the reach of existing verifiers
Innovation

Methods, ideas, or system contributions that make the work stand out.

GenBaB, a BaB framework built on linear bound propagation that verifies NNs with general nonlinearities and architectures
A branching heuristic that uses linear bounds as shortcuts to estimate the potential improvement from branching each neuron
Pre-optimized branching points for nonlinear functions, retrieved efficiently at verification time via a lookup table
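The pre-optimized branching points with lookup-table retrieval can be sketched as follows. This is a toy reconstruction under assumed details (a chord-gap objective, a coarse interval grid, sigmoid restricted to its concave region); the paper's actual objective and table layout may differ.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def chord_gap(l, u, samples=100):
    # worst-case gap between sigmoid and its endpoint chord on [l, u]
    if u - l < 1e-12:
        return 0.0
    slope = (sigmoid(u) - sigmoid(l)) / (u - l)
    return max(abs(sigmoid(x) - (sigmoid(l) + slope * (x - l)))
               for x in (l + i * (u - l) / samples for i in range(samples + 1)))

def best_branch_point(l, u, candidates=20):
    # Offline optimization: pick the split minimizing the worse sub-interval gap.
    pts = [l + i * (u - l) / (candidates + 1) for i in range(1, candidates + 1)]
    return min(pts, key=lambda p: max(chord_gap(l, p), chord_gap(p, u)))

# Offline: tabulate branching points over a coarse grid of (l, u) intervals.
GRID = [0.5 * i for i in range(9)]  # 0.0, 0.5, ..., 4.0
TABLE = {(l, u): best_branch_point(l, u) for l in GRID for u in GRID if u > l}

def lookup_branch_point(l, u):
    # Online: snap the queried interval to the grid and read the table.
    snap = lambda x: min(GRID, key=lambda g: abs(g - x))
    gl, gu = snap(l), snap(u)
    if gu <= gl:  # degenerate after snapping; fall back to the midpoint
        return 0.5 * (l + u)
    return TABLE[(gl, gu)]
```

At verification time, `lookup_branch_point` costs a dictionary read instead of an optimization per branch, which is the kind of acceleration the lookup table provides.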