AI Summary
This work addresses the false sense of data privacy induced by unlearnable examples: samples perturbed to cause premature overfitting to spurious features while suppressing semantic learning. We propose a progressive, multi-stage training paradigm grounded in our novel observation that models initially learn both perturbation and semantic features, yet shallow layers rapidly overfit to perturbations. To break this unlearnability bottleneck, we design a dynamic hierarchical freezing/unfreezing mechanism. Our method integrates progressive layered network training, adaptive parameter scheduling, and multi-stage loss formulation, and is compatible with mainstream architectures including CNNs, ResNets, and Vision Transformers (ViTs). Extensive experiments on CIFAR-10/100 and ImageNet-mini demonstrate substantial improvements over existing defenses, establishing our approach as a new benchmark for evaluating unlearnability mitigation techniques.
Abstract
Unlearnable example techniques have been proposed to prevent third parties from exploiting unauthorized data: they generate unlearnable samples by adding imperceptible perturbations to data before public release. These unlearnable samples misguide model training into learning perturbation features while ignoring semantic image features. We conduct an in-depth analysis and observe that models learn both image features and perturbation features of unlearnable samples at an early stage, but quickly enter an overfitting stage because the shallow layers tend to overfit to perturbation features. Based on these observations, we propose Progressive Staged Training to effectively prevent models from overfitting to perturbation features. We evaluate our method with multiple model architectures on diverse datasets, e.g., CIFAR-10, CIFAR-100, and ImageNet-mini. Our method circumvents the unlearnability of all state-of-the-art methods in the literature and provides a reliable baseline for further evaluation of unlearnable techniques.
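The staged freeze/unfreeze idea can be sketched as a simple schedule: keep the shallow layers (which, per the observation above, overfit to perturbation features first) frozen in early stages, and progressively unfreeze them once deeper semantic features have been learned. This is a minimal illustrative sketch only; the layer names, the one-layer-per-stage schedule, and the `freeze_plan` helper are assumptions, not the authors' actual configuration.

```python
# Hypothetical sketch of a stage-wise freeze/unfreeze schedule.
# Layer names and the schedule are illustrative assumptions, not the
# paper's actual hyperparameters.

def freeze_plan(stage, layers):
    """Return {layer_name: trainable?} for a given training stage.

    Early stages freeze the shallow layers; each subsequent stage
    unfreezes one more layer, working backward from the deepest.
    """
    # Number of shallow layers still frozen at this stage.
    n_frozen = max(len(layers) - 1 - stage, 0)
    return {name: i >= n_frozen for i, name in enumerate(layers)}

layers = ["conv1", "conv2", "conv3", "fc"]
print(freeze_plan(0, layers))  # stage 0: only "fc" trainable
print(freeze_plan(2, layers))  # stage 2: "conv2", "conv3", "fc" trainable
```

In a real training loop, this plan would be applied per stage by toggling each layer's parameter trainability (e.g., `requires_grad` in PyTorch), with the adaptive parameter scheduling and multi-stage loss of the paper layered on top.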