Combating Noisy Labels via Dynamic Connection Masking

📅 2025-08-13
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Deep neural networks suffer significant performance degradation under label noise because they readily memorize corrupted labels. To address this, we propose a Dynamic Connection Masking (DCM) mechanism grounded in model-architecture regularization: during training it adaptively masks less informative connections based on an estimate of each edge's information-carrying capacity, and theoretical analysis shows that this masking reduces gradient error. We also introduce Kolmogorov–Arnold Networks (KANs) to noisy-label classification for the first time, empirically demonstrating their superior noise resilience over standard MLPs. The mechanism is modular and integrates seamlessly with diverse robust training paradigms, including robust loss functions and sample selection strategies, without architectural modification. Extensive experiments on synthetic and real-world noisy-label benchmarks show consistent improvements over state-of-the-art methods, validating the generalizability of the approach across network architectures and noise types.

📝 Abstract
Noisy labels are inevitable in real-world scenarios. Due to the strong capacity of deep neural networks to memorize corrupted labels, these noisy labels can cause significant performance degradation. Existing research on mitigating the negative effects of noisy labels has mainly focused on robust loss functions and sample selection, with comparatively limited exploration of regularization in model architecture. Inspired by the sparsity regularization used in Kolmogorov-Arnold Networks (KANs), we propose a Dynamic Connection Masking (DCM) mechanism for both Multi-Layer Perceptron Networks (MLPs) and KANs to enhance the robustness of classifiers against noisy labels. The mechanism can adaptively mask less important edges during training by evaluating their information-carrying capacity. Through theoretical analysis, we demonstrate its efficiency in reducing gradient error. Our approach can be seamlessly integrated into various noise-robust training methods to build more robust deep networks, including robust loss functions, sample selection strategies, and regularization techniques. Extensive experiments on both synthetic and real-world benchmarks demonstrate that our method consistently outperforms state-of-the-art (SOTA) approaches. Furthermore, we are also the first to investigate KANs as classifiers against noisy labels, revealing their superior noise robustness over MLPs in real-world noisy scenarios. Our code will soon be publicly available.
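The core idea in the abstract, masking the least important edges of a layer by their information-carrying capacity, can be sketched minimally as follows. The importance estimator here (mean |w_ij · x_j| over a batch) is an illustrative stand-in for the paper's capacity measure, and `mask_ratio` is a hypothetical hyperparameter, not the paper's exact formulation:

```python
import numpy as np

def importance_scores(W, x_batch):
    """Proxy for each edge's information-carrying capacity:
    mean |w_ij * x_j| over a batch (an assumption for
    illustration, not the paper's exact estimator)."""
    # W: (out_features, in_features); x_batch: (batch, in_features)
    mean_abs_x = np.abs(x_batch).mean(axis=0)   # (in_features,)
    return np.abs(W) * mean_abs_x               # (out_features, in_features)

def dynamic_mask(W, x_batch, mask_ratio=0.2):
    """Return a 0/1 mask that zeroes the `mask_ratio` fraction
    of edges with the lowest importance scores."""
    scores = importance_scores(W, x_batch)
    k = int(mask_ratio * scores.size)
    if k == 0:
        return np.ones_like(W)
    # k-th smallest score becomes the masking threshold
    threshold = np.partition(scores.ravel(), k - 1)[k - 1]
    return (scores > threshold).astype(W.dtype)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
x = rng.normal(size=(32, 8))
M = dynamic_mask(W, x, mask_ratio=0.25)
masked_W = W * M  # the forward pass would then use masked_W
```

Because the mask is recomputed from the current weights and activations, it changes dynamically over training rather than pruning edges permanently.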
Problem

Research questions and friction points this paper is trying to address.

Enhancing classifier robustness against noisy labels
Adaptively masking less important network connections
Integrating dynamic masking with noise-robust training methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic Connection Masking for noise robustness
Adaptive edge masking by information capacity
Integration with existing noise-robust training methods
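To illustrate the integration point above, the sketch below composes a masked forward pass with a standard robust loss, the generalized cross entropy of Zhang and Sabuncu (2018). The GCE choice and the fixed placeholder mask are assumptions for demonstration; the paper's method would supply its own dynamically computed mask:

```python
import numpy as np

def generalized_cross_entropy(probs, labels, q=0.7):
    """GCE loss (1 - p_y^q) / q, a standard noise-robust
    objective; used here only to show that masking composes
    with such losses without architectural changes."""
    p_y = probs[np.arange(len(labels)), labels]
    return ((1.0 - p_y ** q) / q).mean()

def masked_forward(W, mask, x):
    """Linear layer with masked weights, followed by softmax."""
    logits = x @ (W * mask).T
    z = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
W = rng.normal(size=(3, 5))
mask = (rng.random(W.shape) > 0.2).astype(W.dtype)  # placeholder mask
x = rng.normal(size=(16, 5))
labels = rng.integers(0, 3, size=16)
probs = masked_forward(W, mask, x)
loss = generalized_cross_entropy(probs, labels)
```

The same pattern applies to sample-selection pipelines: the mask only alters the forward pass, so selection criteria and loss functions remain untouched.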