Beyond Loss Values: Robust Dynamic Pruning via Loss Trajectory Alignment

📅 2026-04-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the vulnerability of existing dynamic data pruning methods to label noise: because they rank samples by per-sample loss, they often mistakenly retain high-loss noisy samples and suffer significant performance degradation. To mitigate this, the authors propose AlignPrune, a plug-and-play module that introduces a novel Dynamic Alignment Score (DAS), derived from loss trajectories, as the pruning criterion. Without altering the model architecture or training procedure, AlignPrune accurately identifies and removes noisy samples. The method integrates seamlessly into mainstream dynamic pruning frameworks, substantially enhancing their robustness on noisy data. Extensive experiments across five benchmark datasets demonstrate consistent improvements under various noise types and pruning ratios, with accuracy gains of up to 6.3% over current state-of-the-art approaches.
📝 Abstract
Existing dynamic data pruning methods often fail under noisy-label settings, as they typically rely on per-sample loss as the ranking criterion. This can mistakenly preserve noisy samples due to their high loss values, resulting in a significant performance drop. To address this, we propose AlignPrune, a noise-robust module designed to enhance the reliability of dynamic pruning under label noise. Specifically, AlignPrune introduces the Dynamic Alignment Score (DAS), a loss-trajectory-based criterion that enables more accurate identification of noisy samples, thereby improving pruning effectiveness. As a simple yet effective plug-and-play module, AlignPrune can be seamlessly integrated into state-of-the-art dynamic pruning frameworks, consistently outperforming them without modifying either the model architecture or the training pipeline. Extensive experiments on five widely used benchmarks across various noise types and pruning ratios demonstrate the effectiveness of AlignPrune, boosting accuracy by up to 6.3% over state-of-the-art baselines. Our results offer a generalizable solution for pruning under noisy data, encouraging further exploration of learning in real-world scenarios. Code is available at: https://github.com/leonqin430/AlignPrune.
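The abstract does not define the Dynamic Alignment Score, so the sketch below is purely illustrative: it assumes a trajectory-alignment criterion in which each sample's per-epoch loss history is compared (via centered cosine similarity) against the dataset-mean trajectory, on the intuition that mislabeled samples' losses evolve differently from the clean majority. The function names `alignment_scores` and `prune` and the exact scoring formula are hypothetical, not the paper's actual DAS.

```python
import numpy as np

def alignment_scores(loss_traj: np.ndarray) -> np.ndarray:
    """Hypothetical trajectory-alignment score (NOT the paper's DAS).

    loss_traj: (n_samples, n_epochs) array of per-sample loss histories.
    Each trajectory is mean-centered so that only its *shape* matters,
    then compared via cosine similarity against the mean centered
    trajectory. Higher score = more aligned with the majority trend.
    """
    centered = loss_traj - loss_traj.mean(axis=1, keepdims=True)
    ref = centered.mean(axis=0)  # reference (majority) trajectory shape
    num = centered @ ref
    denom = np.linalg.norm(centered, axis=1) * np.linalg.norm(ref) + 1e-12
    return num / denom

def prune(loss_traj: np.ndarray, keep_ratio: float = 0.7) -> np.ndarray:
    """Keep the keep_ratio fraction of samples with the highest alignment.

    Returns the indices of the retained samples; the rest are treated
    as likely-noisy and dropped for the next training round.
    """
    scores = alignment_scores(loss_traj)
    k = int(len(scores) * keep_ratio)
    return np.argsort(scores)[::-1][:k]
```

With synthetic data where clean samples have decreasing loss and mislabeled samples have increasing loss, the increasing-loss samples receive low alignment scores and are pruned first. A plain loss-ranking criterion would instead *keep* those high-loss samples, which is exactly the failure mode the paper targets.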
Problem

Research questions and friction points this paper is trying to address.

dynamic pruning
label noise
loss trajectory
noisy samples
data pruning
Innovation

Methods, ideas, or system contributions that make the work stand out.

dynamic pruning
label noise
loss trajectory alignment
plug-and-play module
noisy sample identification
Huaiyuan Qin
Institute for Infocomm Research (I2R), A*STAR, Singapore
Computer Vision, Deep Learning
Muli Yang
Institute for Infocomm Research (I2R), A*STAR, Singapore
Computer Vision, Machine Learning, Open-World Learning, Multimodal Modeling
Gabriel James Goenawan
Institute for Infocomm Research (I2R), A*STAR, Singapore
Kai Wang
National University of Singapore
Zheng Wang
Wuhan University
Multimedia Content Analysis, Computer Vision, Artificial Intelligence
Peng Hu
Sichuan University
Xi Peng
Sichuan University
Hongyuan Zhu
Institute for Infocomm Research (I2R), A*STAR, Singapore