Compression-Aware Certified Training

📅 2025-06-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
For safety-critical, resource-constrained applications, deep neural networks must be both efficient and provably robust, yet existing approaches decouple model compression (e.g., pruning, quantization) from robustness certification, leading to suboptimal trade-offs. Method: we propose CACTUS, the first training framework that unifies compression and certified robustness, integrating structured pruning constraints and quantization-aware training into a single end-to-end optimization and jointly optimizing robustness together with sparse, low-precision representations via ensemble-based network modeling. Contribution/Results: breaking the conventional two-stage paradigm, CACTUS achieves significant gains in certified accuracy under pruning and quantization across multiple benchmarks, including CIFAR-10 and ImageNet, and under diverse input perturbations, consistently outperforming state-of-the-art sequential methods.
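To make the joint objective concrete, here is a minimal, hypothetical sketch of the idea the summary describes: certified-robustness training (illustrated with simple interval bound propagation through one masked linear + ReLU layer) where the pruning mask is applied during the forward pass, so the certified bounds are computed for the compressed weights rather than after a separate compression stage. The setup, names, and mask rule are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))               # toy dense weight matrix
mask = (np.abs(W) > 0.5).astype(float)    # toy magnitude-pruning mask (assumption)

def ibp_bounds(x, eps, W, mask):
    """Propagate the L-inf ball [x - eps, x + eps] through masked linear + ReLU.

    Applying the mask *before* bound computation is the key difference from a
    two-stage pipeline: certification sees the compressed network.
    """
    Wm = W * mask                          # compression applied inside training
    center = x @ Wm.T                      # midpoint of the output interval
    radius = eps * np.abs(Wm).sum(axis=1)  # worst-case deviation per output unit
    lo, hi = center - radius, center + radius
    # ReLU is monotone, so applying it to both ends keeps the bounds sound.
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

x = rng.normal(size=3)
lo, hi = ibp_bounds(x, 0.1, W, mask)
```

In a full training loop, a certified loss built from `lo`/`hi` would be minimized jointly with the mask (or quantizer) in place, rather than pruning a robustly trained model afterwards.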

📝 Abstract
Deep neural networks deployed in safety-critical, resource-constrained environments must balance efficiency and robustness. Existing methods treat compression and certified robustness as separate goals, compromising either efficiency or safety. We propose CACTUS (Compression Aware Certified Training Using network Sets), a general framework for unifying these objectives during training. CACTUS models maintain high certified accuracy even when compressed. We apply CACTUS for both pruning and quantization and show that it effectively trains models which can be efficiently compressed while maintaining high accuracy and certifiable robustness. CACTUS achieves state-of-the-art accuracy and certified performance for both pruning and quantization on a variety of datasets and input specifications.
Problem

Research questions and friction points this paper is trying to address.

Balancing efficiency and robustness in deep neural networks
Unifying compression and certified robustness during training
Maintaining high accuracy and robustness when compressed
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifies compression and robustness in training
Maintains high accuracy when compressed
Applies to pruning and quantization effectively
Changming Xu
University of Illinois Urbana-Champaign
Trustworthy Machine Learning
Gagandeep Singh
Department of Computer Science, University of Illinois Urbana-Champaign