Multiplicative Reweighting for Robust Neural Network Optimization

๐Ÿ“… 2021-02-24
๐Ÿ›๏ธ SIAM Journal of Imaging Sciences
๐Ÿ“ˆ Citations: 9
โœจ Influential: 1
๐Ÿค– AI Summary
To address performance degradation of neural networks under label noise, this paper proposes a sample-level dynamic reweighting training framework based on Multiplicative Weights (MW) updates. It is the first work to integrate the MW mechanism into deep learning optimization, providing theoretical convergence guarantees under gradient descent and analytically characterizing its intrinsic robustness to noisy labels in the 1D setting. Unlike existing approaches, the method requires no auxiliary networks or explicit noise modeling; instead, it adaptively adjusts sample importance via lightweight, iterative weight updates. Extensive experiments demonstrate significant improvements in classification accuracy on CIFAR-10/100 with synthetic label noise and on Clothing1M with real-world noisy labels. Moreover, the resulting models exhibit enhanced robustness against adversarial attacks. This work establishes a noise-robust training paradigm that is concise, theoretically grounded, and computationally efficient.
๐Ÿ“ Abstract
Neural networks are widespread due to their powerful performance. Yet, they degrade in the presence of noisy labels at training time. Inspired by the setting of learning with expert advice, where multiplicative weights (MW) updates were recently shown to be robust to moderate data corruptions in expert advice, we propose to use MW for reweighting examples during neural network optimization. We theoretically establish the convergence of our method when used with gradient descent and prove its advantages in the 1D case. We then empirically validate our findings for the general case by showing that MW improves neural networks' accuracy in the presence of label noise on CIFAR-10, CIFAR-100 and Clothing1M. We also show the impact of our approach on adversarial robustness.
Problem

Research questions and friction points this paper is trying to address.

Addressing neural network performance degradation from noisy training labels
Proposing multiplicative reweighting to enhance robustness against label noise
Improving model accuracy and adversarial robustness on benchmark datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multiplicative weights reweight examples during training
Method converges with gradient descent optimization
Improves accuracy under label noise conditions