🤖 AI Summary
This work addresses the challenge of deploying deep neural networks on resource-constrained embedded systems, where compression techniques such as quantization and pruning are essential but may compromise behavioral consistency with the original model, particularly in safety-critical applications. To tackle this issue, the paper proposes SimCert, a probabilistic certification framework that supports heterogeneous compressed architectures through a dual-network symbolic propagation mechanism. By integrating variance-aware bound analysis based on Bernstein inequalities, the method enables tight verification of behavioral similarity at adjustable confidence levels. The approach improves both the tightness and scalability of certification, outperforming existing techniques on benchmarks including ACAS Xu and computer vision tasks, and thereby offers efficient and reliable quantitative safety guarantees for compressed models.
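The paper's exact formulation is not given here, but the "variance-aware bound analysis based on Bernstein inequalities" can be illustrated with a standard empirical Bernstein upper confidence bound (in the style of Maurer and Pontil) on the expected deviation between the original and compressed models' outputs. The function name and interface below are illustrative assumptions, not the paper's API:

```python
import numpy as np

def empirical_bernstein_bound(deviations, delta, value_range):
    """Upper confidence bound on the mean behavioral deviation
    E[|f(x) - f_c(x)|] from n sampled deviations, via the empirical
    Bernstein inequality (Maurer & Pontil, 2009). Holds with
    probability at least 1 - delta; `value_range` is the known width
    of the interval the deviations lie in."""
    n = len(deviations)
    mean = float(np.mean(deviations))
    var = float(np.var(deviations, ddof=1))  # sample variance
    log_term = np.log(2.0 / delta)
    return (mean
            + np.sqrt(2.0 * var * log_term / n)                 # variance-aware term
            + 7.0 * value_range * log_term / (3.0 * (n - 1)))   # range correction
```

Because the middle term scales with the observed sample variance rather than with the worst-case range alone, low-variance deviation samples yield a much tighter certificate than a Hoeffding-style bound at the same confidence level, which is presumably the "variance-aware" gain the summary refers to.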
📝 Abstract
Deploying Deep Neural Networks (DNNs) on resource-constrained embedded systems requires aggressive model compression techniques like quantization and pruning. However, ensuring that the compressed model preserves the behavioral fidelity of the original design is a critical challenge in the safety-critical system design flow. Existing verification methods often lack scalability or fail to handle the architectural heterogeneity introduced by pruning. In this work, we propose SimCert, a probabilistic certification framework for verifying the behavioral similarity of compressed neural networks. Unlike worst-case analysis, SimCert provides quantitative safety guarantees with adjustable confidence levels. Our framework features: (1) a dual-network symbolic propagation method supporting both quantization and pruning; (2) a variance-aware bounding technique using Bernstein's inequality to tighten safety certificates; and (3) an automated verification toolchain. Experimental results on ACAS Xu and computer vision benchmarks demonstrate that SimCert outperforms state-of-the-art baselines in both certificate tightness and verification scalability.
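The abstract does not spell out the dual-network symbolic propagation method itself. As a rough sketch of the underlying idea, the same input region can be pushed through both the original and the compressed network with interval arithmetic, and the two sets of output bounds combined into a sound (if loose) bound on their behavioral gap. The function names, the all-ReLU-layer simplification, and the plain interval (rather than symbolic) bounds below are our assumptions, not the paper's method:

```python
import numpy as np

def interval_forward(weights, biases, lo, hi):
    """Propagate an input box [lo, hi] through a fully connected network
    using interval arithmetic; every layer is assumed ReLU-activated
    for simplicity."""
    for W, b in zip(weights, biases):
        Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)  # split weights by sign
        new_lo = Wp @ lo + Wn @ hi + b                   # lower pre-activation bound
        new_hi = Wp @ hi + Wn @ lo + b                   # upper pre-activation bound
        lo, hi = np.maximum(new_lo, 0.0), np.maximum(new_hi, 0.0)  # ReLU
    return lo, hi

def output_deviation_bound(net_a, net_b, lo, hi):
    """Upper bound on |f_a(x) - f_b(x)| over the whole input box,
    obtained by propagating the same box through both networks
    (e.g. an original model and its quantized/pruned counterpart)."""
    lo_a, hi_a = interval_forward(*net_a, lo, hi)
    lo_b, hi_b = interval_forward(*net_b, lo, hi)
    return np.maximum(hi_a - lo_b, hi_b - lo_a)
```

Because the two networks are propagated independently here, the gap bound ignores their input correlation; a genuine dual-network scheme like SimCert's would track the pair jointly to stay tighter, and would also have to handle the layer-shape mismatches that pruning introduces.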