Variation-Bounded Loss for Noise-Tolerant Learning

📅 2025-11-15

🤖 AI Summary
To address performance degradation caused by label noise in supervised learning, this paper proposes a family of robust loss functions characterized by a Bounded Variation Ratio (BVR). Departing from conventional symmetry-based conditions, we introduce the variation ratio as a novel theoretical criterion for quantifying loss robustness and rigorously establish its intrinsic connection to noise tolerance. Based on this insight, we develop a concise and scalable framework for constructing asymmetric robust losses, unifying standard losses—including cross-entropy and mean absolute error—into BVR-bounded forms. Theoretical analysis proves that BVR-bounded losses satisfy sufficient conditions for noise robustness. Extensive experiments on CIFAR-10/100 and WebVision demonstrate that our approach significantly improves generalization accuracy and training stability under diverse noise settings, including symmetric, asymmetric, and instance-dependent label noise.

📝 Abstract
Mitigating the negative impact of noisy labels has been a perennial issue in supervised learning. Robust loss functions have emerged as a prevalent solution to this problem. In this work, we introduce the Variation Ratio as a novel property related to the robustness of loss functions, and propose a new family of robust loss functions, termed Variation-Bounded Loss (VBL), which is characterized by a bounded variation ratio. We provide theoretical analyses of the variation ratio, proving that a smaller variation ratio leads to better robustness. Furthermore, we reveal that the variation ratio provides a feasible way to relax the symmetric condition and offers a more concise path to achieving the asymmetric condition. Based on the variation ratio, we reformulate several commonly used loss functions into a variation-bounded form for practical applications. Experiments on various datasets demonstrate the effectiveness and flexibility of our approach.
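The abstract does not reproduce the VBL formula itself, so the sketch below only illustrates the background contrast the paper builds on: cross-entropy is unbounded (a single mislabeled sample near-confidently classified can dominate the gradient), while mean absolute error over the softmax simplex is bounded by 2, a property long associated with noise tolerance. The logits and labels here are illustrative assumptions, not from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def cross_entropy(p, y):
    # Unbounded: -log p_y grows without limit as p_y -> 0,
    # so a confidently-wrong noisy label incurs a huge loss.
    return -np.log(p[y])

def mae(p, y):
    # Bounded in [0, 2] on the probability simplex; this
    # boundedness is what a bounded variation ratio generalizes.
    onehot = np.eye(len(p))[y]
    return np.abs(p - onehot).sum()

# A confident prediction for class 0 (illustrative values).
p = softmax(np.array([5.0, 0.0, -5.0]))

ce_clean, ce_noisy = cross_entropy(p, 0), cross_entropy(p, 2)
mae_clean, mae_noisy = mae(p, 0), mae(p, 2)

# CE explodes on the flipped label; MAE stays capped at 2.
print(f"CE  clean={ce_clean:.4f} noisy={ce_noisy:.4f}")
print(f"MAE clean={mae_clean:.4f} noisy={mae_noisy:.4f}")
```

Under a flipped (noisy) label, the CE loss is roughly 10 while the MAE loss saturates near its cap of 2, which is why bounded losses limit how much any single corrupted label can steer training.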
Problem

Research questions and friction points this paper is trying to address.

Developing learning methods that remain noise-tolerant when trained on noisy labels
Designing robust loss functions whose bounded variation ratio yields provable robustness
Reformulating existing loss functions to handle asymmetric noise conditions effectively
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces Variation Ratio for robust loss functions
Proposes Variation-Bounded Loss with bounded variation ratio
Reformulates common loss functions into variation-bounded form