Clip-and-Verify: Linear Constraint-Driven Domain Clipping for Accelerating Neural Network Verification

📅 2025-12-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Problem: Branch-and-bound (BaB) methods for neural network formal verification suffer from low efficiency, excessive subproblem generation, and overly loose intermediate-layer bound relaxations. Method: This paper proposes a linear-constraint-driven input-domain pruning framework. Its core innovation lies in directly leveraging propagated linear constraints to dynamically prune input subregions that are either verified or provably unsatisfiable, while simultaneously tightening layer-wise activation bounds. The authors design a lightweight, GPU-native pruning operator that avoids external solvers and enables end-to-end acceleration. The framework is tightly integrated into mainstream BaB verifiers such as α,β-CROWN. Results: Experiments across multiple benchmarks show up to a 96% reduction in subproblem count, significant improvements in both verification speed and precision, state-of-the-art performance on several key metrics, and instrumental support for the team's championship win at VNN-COMP 2025.
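To make the domain-clipping idea concrete: a linear constraint a·x + b ≤ 0 that must hold on any still-unverified input can shrink an input box dimension by dimension. The sketch below is our own illustration of this generic Fourier-Motzkin-style tightening, not the paper's actual operator; the function and argument names are assumptions.

```python
import numpy as np

def clip_box(a, b, l, u):
    """Tighten the box [l, u] under the linear constraint a @ x + b <= 0.

    Inputs violating the constraint are already verified or irrelevant,
    so the surviving subregion satisfies a @ x <= -b and each coordinate
    interval can be clipped independently.
    Returns (l', u'); if l'[j] > u'[j] for some j, the subdomain is empty.
    """
    m = np.minimum(a * l, a * u)   # per-dimension minimum of a_i * x_i over the box
    residual = m.sum() - m         # smallest contribution of all *other* dimensions
    slack = -b - residual          # budget left for a_j * x_j in each dimension j
    pos, neg = a > 0, a < 0
    l_new, u_new = l.copy(), u.copy()
    u_new[pos] = np.minimum(u[pos], slack[pos] / a[pos])  # a_j > 0 caps x_j above
    l_new[neg] = np.maximum(l[neg], slack[neg] / a[neg])  # a_j < 0 caps x_j below
    return l_new, u_new
```

For example, the constraint x1 + 2*x2 ≤ 1 on the unit box [0, 1]² leaves x1 untouched but clips x2 to [0, 0.5]; a constraint that no point of the box can satisfy yields an empty (prunable) subdomain.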

📝 Abstract
State-of-the-art neural network (NN) verifiers demonstrate that applying the branch-and-bound (BaB) procedure with fast bounding techniques plays a key role in tackling many challenging verification properties. In this work, we introduce the linear constraint-driven clipping framework, a class of scalable and efficient methods designed to enhance the efficacy of NN verifiers. Under this framework, we develop two novel algorithms that efficiently utilize linear constraints to 1) reduce portions of the input space that are either verified or irrelevant to a subproblem in the context of branch-and-bound, and 2) directly improve intermediate bounds throughout the network. The process novelly leverages linear constraints that often arise from bound propagation methods and is general enough to also incorporate constraints from other sources. It efficiently handles linear constraints using a specialized GPU procedure that can scale to large neural networks without the use of expensive external solvers. Our verification procedure, Clip-and-Verify, consistently tightens bounds across multiple benchmarks and can significantly reduce the number of subproblems handled during BaB. We show that our clipping algorithms can be integrated with BaB-based verifiers such as $α,β$-CROWN, utilizing either the split constraints in activation-space BaB or the output constraints that denote the unverified input space. We demonstrate the effectiveness of our procedure on a broad range of benchmarks where, in some instances, we witness a 96% reduction in the number of subproblems during branch-and-bound, and also achieve state-of-the-art verified accuracy across multiple benchmarks. Clip-and-Verify is part of the $α,β$-CROWN verifier (http://abcrown.org), the VNN-COMP 2025 winner. Code available at https://github.com/Verified-Intelligence/Clip_and_Verify.
Problem

Research questions and friction points this paper is trying to address.

BaB verifiers waste effort on input-space regions that are already verified or irrelevant to the current subproblem
Intermediate-layer bounds from fast relaxations are loose, and tightening them with external solvers is expensive
The resulting explosion of subproblems limits verification speed and scalability
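To see why a clipped input box tightens layer-wise bounds, a standard interval pass through a single linear layer suffices. This is textbook interval arithmetic used here only for illustration, not the paper's bound-propagation code; the names are our own.

```python
import numpy as np

def interval_linear(W, bias, l, u):
    """Interval bounds of W @ x + bias over the input box [l, u].

    A tighter box (smaller radius r) directly shrinks the output radius
    |W| @ r, which is how input-domain clipping improves the
    pre-activation bounds of every downstream layer.
    """
    c = (l + u) / 2.0              # box center
    r = (u - l) / 2.0              # box half-widths
    center = W @ c + bias
    radius = np.abs(W) @ r
    return center - radius, center + radius
```

For W = [[1, -1]] over [0, 1]², the output interval is [-1, 1]; clipping the second coordinate to [0, 0.5] tightens it to [-0.5, 1].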
Innovation

Methods, ideas, or system contributions that make the work stand out.

Linear constraint-driven clipping accelerates neural network verification
Specialized GPU procedure handles constraints without expensive external solvers
Reduces subproblems in branch-and-bound by tightening bounds efficiently
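The "specialized GPU procedure" point can be appreciated with a batched sketch: one linear constraint per subdomain, clipped with pure elementwise tensor operations and no LP solver. NumPy stands in for a GPU array library here, and all names are our own assumptions, not the paper's API.

```python
import numpy as np

def clip_boxes_batched(A, b, L, U):
    """Clip K subdomains at once, one constraint A[k] @ x + b[k] <= 0 each.

    A: (K, d), b: (K,), L/U: (K, d). Only elementwise ops, reductions,
    and masked writes are used, so the same code maps onto a GPU tensor
    library. Emptied boxes are flagged for pruning.
    """
    M = np.minimum(A * L, A * U)                         # (K, d) per-dim minima
    slack = (-b)[:, None] - (M.sum(axis=1, keepdims=True) - M)
    pos, neg = A > 0, A < 0
    L_new, U_new = L.copy(), U.copy()
    U_new[pos] = np.minimum(U[pos], slack[pos] / A[pos])
    L_new[neg] = np.maximum(L[neg], slack[neg] / A[neg])
    alive = (L_new <= U_new).all(axis=1)                 # False => subdomain pruned
    return L_new, U_new, alive
```

Batching is the key design choice: clipping thousands of BaB subdomains costs one fused pass over (K, d) tensors rather than K solver calls.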