🤖 AI Summary
To address degraded robustness and miscalibrated uncertainty estimates under test-time image corruptions (e.g., noise, blur), this paper proposes a degradation-aware joint training framework. Inspired by generative diffusion models, it couples data mollification—Gaussian noise injection and Gaussian blurring applied to training images—with label smoothing, implicitly aligning degradation severity with predictive confidence during standard training. The method incurs no additional inference overhead and integrates seamlessly with existing data augmentation pipelines. Evaluated on image corruption benchmarks for CIFAR-10/100, TinyImageNet, and ImageNet, it consistently improves classification robustness (average +2.1% top-1 accuracy) and uncertainty calibration (reducing Expected Calibration Error by up to 38%). The result is an efficient, plug-and-play training paradigm for degradation-robust deep learning models.
📝 Abstract
Introducing training-time augmentations is a key technique for enhancing generalization and preparing deep neural networks for test-time corruptions. Inspired by the success of generative diffusion models, we propose a novel approach that couples data mollification, in the form of image noising and blurring, with label smoothing to align predicted label confidence with the degree of image degradation. The method is simple to implement, introduces negligible overhead, and can be combined with existing augmentations. We demonstrate improved robustness and uncertainty quantification on the corrupted-image benchmarks of the CIFAR, TinyImageNet, and ImageNet datasets.
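The core idea—degrade a training image and smooth its label in proportion to the degradation—can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the noise-only mollification (the blurring branch is analogous), the uniform severity sampling, and the hyperparameters `sigma_max` and `smooth_max` are all assumptions made for the example.

```python
import numpy as np

def mollify_with_smoothing(image, label, num_classes, rng,
                           sigma_max=0.5, smooth_max=0.2):
    """Hypothetical sketch of severity-coupled mollification and label smoothing.

    `sigma_max` (noise scale) and `smooth_max` (max smoothing mass) are
    illustrative hyperparameters, not values from the paper.
    """
    # Sample a degradation severity t in [0, 1] for this training example.
    t = rng.uniform(0.0, 1.0)

    # Mollify the image: additive Gaussian noise scaled by severity.
    # (The paper also uses Gaussian blurring; a blur kernel whose width
    # grows with t would slot in here the same way.)
    noisy = np.clip(image + rng.normal(0.0, t * sigma_max, size=image.shape),
                    0.0, 1.0)

    # Smooth the one-hot label: heavier degradation -> lower confidence.
    eps = t * smooth_max
    target = np.full(num_classes, eps / num_classes)
    target[label] += 1.0 - eps
    return noisy, target
```

A training loop would apply this per sample (after any other augmentations) and train against the soft `target` with cross-entropy, so no change is needed at inference time.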