Adaptive Diffusion Denoised Smoothing: Certified Robustness via Randomized Smoothing with Differentially Private Guided Denoising Diffusion

📅 2025-07-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of certifying robustness of vision models against adversarial examples. We propose an adaptive randomized smoothing method guided by denoising diffusion, grounded in differential privacy. Our core contribution is the first reinterpretation of the guided denoising diffusion process as a sequence of input-adaptive Gaussian Differential Privacy (GDP) mechanisms; we further design a GDP privacy filter to enable end-to-end provable robustness certification, thereby extending the theoretical framework of adaptive randomized smoothing. The method integrates guided diffusion modeling, adaptive noise injection, and compositional GDP analysis. Evaluated on ImageNet under the ℓ₂ threat model, it simultaneously improves both certified accuracy and standard (clean) accuracy, achieving a favorable trade-off between robustness and generalization.

📝 Abstract
We propose Adaptive Diffusion Denoised Smoothing, a method for certifying the predictions of a vision model against adversarial examples, while adapting to the input. Our key insight is to reinterpret a guided denoising diffusion model as a long sequence of adaptive Gaussian Differentially Private (GDP) mechanisms refining a pure noise sample into an image. We show that these adaptive mechanisms can be composed through a GDP privacy filter to analyze the end-to-end robustness of the guided denoising process, yielding a provable certification that extends the adaptive randomized smoothing analysis. We demonstrate that our design, under a specific guiding strategy, can improve both certified accuracy and standard accuracy on ImageNet for an $\ell_2$ threat model.
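The certification baseline the abstract builds on can be illustrated with plain randomized smoothing: classify many Gaussian-noised copies of the input and convert the top-class probability into a certified $\ell_2$ radius. The sketch below uses a toy one-dimensional classifier and a Monte Carlo point estimate; it is not the paper's adaptive, diffusion-guided method, and the function names are illustrative. A real certificate would replace the point estimate of the top-class probability with a lower confidence bound.

```python
# Minimal randomized-smoothing sketch (Cohen et al. style), NOT the
# paper's adaptive method. `f` is an assumed toy base classifier.
from statistics import NormalDist
import random

def certify(f, x, sigma=0.5, n=10_000, seed=0):
    """Monte Carlo estimate of the smoothed classifier's top class and
    a certified l2 radius sigma * Phi^{-1}(p_A)."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n):
        noisy = [xi + rng.gauss(0.0, sigma) for xi in x]
        c = f(noisy)
        counts[c] = counts.get(c, 0) + 1
    top, top_count = max(counts.items(), key=lambda kv: kv[1])
    p_a = top_count / n  # point estimate; a real certificate lower-bounds this
    if p_a <= 0.5:
        return top, 0.0  # abstain: no certificate
    radius = sigma * NormalDist().inv_cdf(p_a)
    return top, radius

# Toy classifier: sign of the first coordinate.
f = lambda v: int(v[0] > 0)
label, radius = certify(f, [1.0, 0.0], sigma=0.5)
```

The paper's contribution replaces the fixed Gaussian noise here with input-adaptive noise injected through a guided diffusion model, which is what requires the GDP composition analysis.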
Problem

Research questions and friction points this paper is trying to address.

Certify vision model predictions against adversarial examples
Adapt to input using differentially private guided diffusion
Improve certified and standard accuracy on ImageNet
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive Diffusion Denoised Smoothing for robustness
GDP privacy filter for guided denoising
Improves certified and standard accuracy
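The GDP privacy filter named above can be sketched as a running budget over adaptive Gaussian mechanisms: $\mu$-GDP steps compose to $\sqrt{\sum_t \mu_t^2}$, and the filter stops admitting steps once the next one would exceed the target $\mu$. This is a hedged illustration of GDP filter composition in general; the class and method names are assumptions, not the paper's API.

```python
# Illustrative GDP budget filter: mu-GDP steps compose as
# sqrt(sum of mu_t^2); reject any step that would overspend.
import math

class GDPFilter:
    def __init__(self, mu_total):
        self.mu_total = mu_total
        self.spent_sq = 0.0  # running sum of mu_t^2

    def try_step(self, mu_t):
        """Charge the budget and return True if the step fits, else False."""
        if self.spent_sq + mu_t**2 > self.mu_total**2 + 1e-12:
            return False
        self.spent_sq += mu_t**2
        return True

    @property
    def spent(self):
        return math.sqrt(self.spent_sq)

# Each denoising step costs mu_t = 0.3 against a total budget of mu = 1.0.
filt = GDPFilter(mu_total=1.0)
accepted = sum(filt.try_step(0.3) for _ in range(20))
```

In the paper's setting the per-step costs $\mu_t$ are input-adaptive (set by the guiding strategy), which is exactly why a filter, rather than a fixed composition bound, is needed for an end-to-end certificate.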
Frederick Shpilevskiy
University of British Columbia, Vancouver, Canada
Saiyue Lyu
ServiceNow Research
Krishnamurthy Dj Dvijotham
ServiceNow Research
Mathias Lécuyer
University of British Columbia
Machine Learning, Privacy, Security, Systems
Pierre-André Noël
ServiceNow Research
Machine learning, graphs, stochastic processes