🤖 AI Summary
Deep learning models struggle to simultaneously achieve strong robustness against adaptive adversarial attacks and high natural generalization performance.
Method: The paper first identifies, and empirically validates, that structural priors significantly enhance model robustness under strong adaptive attacks. Building on this insight, it proposes the Elastic Dictionary Learning Network (EDLNet), which integrates dictionary learning into the ResNet architecture and introduces an elastic parameterization mechanism to overcome the "robustness illusion" of conventional dictionary-learning-based CNNs.
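The dictionary-learning component that this line of work builds on can be illustrated with classic sparse coding. The sketch below (an illustration of the general technique, not EDLNet's actual layer) solves min_z 0.5·||x − Dz||² + λ·||z||₁ over a fixed dictionary D via ISTA, the iteration that dictionary-learning-inspired CNN layers typically unroll:

```python
import numpy as np

def soft_threshold(v, lam):
    """Elementwise soft-thresholding: the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ista_sparse_code(D, x, lam=0.1, n_iter=100):
    """Sparse-code x over dictionary D with ISTA:
    minimize 0.5 * ||x - D z||^2 + lam * ||z||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)           # gradient of the quadratic term
        z = soft_threshold(z - grad / L, lam / L)
    return z
```

Unrolling a few such iterations and making D a learned parameter yields the kind of structural prior the summary refers to; the sparsity acts as a built-in denoiser on feature activations.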
Contribution/Results: Theoretically, the paper analyzes robustness using influence functions. Empirically, EDLNet achieves consistent, substantial gains over state-of-the-art methods on mainstream robustness benchmarks, including RobustBench, improving adversarial robustness and natural generalization accuracy at the same time.
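Influence functions estimate how a model's test loss would change if one training point were upweighted, without retraining: I ≈ −∇L_test(ŵ)ᵀ H⁻¹ ∇L_k(ŵ), where H is the empirical Hessian. As a hedged sketch of the general tool (not the paper's derivation), here it is for damped least-squares regression, where everything is in closed form:

```python
import numpy as np

def influence_on_test_loss(X, y, x_test, y_test, k, damp=1e-6):
    """Predicted change rate of the test loss when training point k is
    upweighted, via the influence-function formula
    I = -grad_test^T H^{-1} grad_k, for damped least-squares regression."""
    n, d = X.shape
    H = X.T @ X / n + damp * np.eye(d)          # empirical Hessian (damped)
    w = np.linalg.solve(H, X.T @ y / n)         # fitted parameters
    grad_k = (X[k] @ w - y[k]) * X[k]           # gradient of point k's loss
    grad_test = (x_test @ w - y_test) * x_test  # gradient of the test loss
    return -grad_test @ np.linalg.solve(H, grad_k)
```

For nonconvex deep networks the Hessian inverse is only approximated (e.g. with conjugate gradients), but the same quantity underlies influence-based robustness analyses.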
📝 Abstract
This work investigates a novel approach to boosting adversarial robustness and generalization by incorporating structural priors into the design of deep learning models. Specifically, our study reveals, surprisingly, that existing dictionary-learning-inspired convolutional neural networks (CNNs) provide a false sense of security against adversarial attacks. To address this, we propose Elastic Dictionary Learning Networks (EDLNets), a novel ResNet architecture that significantly enhances adversarial robustness and generalization. The approach is supported by a theoretical robustness analysis using influence functions. Moreover, extensive experiments demonstrate consistent and significant improvements on open robustness leaderboards such as RobustBench, surpassing state-of-the-art baselines. To the best of our knowledge, this is the first work to discover and validate that structural priors can reliably enhance deep learning robustness under strong adaptive attacks, opening a promising direction for future research.