Joint Asymmetric Loss for Learning with Noisy Labels

📅 2025-07-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Asymmetric loss functions are incompatible with advanced optimization frameworks (e.g., APL) under label noise, particularly in passive learning settings. Method: This paper proposes the Joint Asymmetric Loss (JAL) framework, centered on a novel Asymmetric Mean Squared Error (AMSE) loss—extending asymmetry to passive learning for the first time—and a unified active–passive joint optimization mechanism satisfying asymmetric loss conditions. Contribution/Results: We derive necessary and sufficient conditions for JAL’s robust convergence. Extensive experiments across diverse noise types and intensities demonstrate that JAL consistently outperforms symmetric loss baselines, achieving absolute accuracy gains of 2.1–5.7 percentage points on CIFAR-10, CIFAR-100, and WebVision benchmarks. These results substantiate JAL’s enhanced robustness to noisy labels and improved generalization capability.

Technology Category

Application Category

📝 Abstract
Learning with noisy labels is a crucial task for training accurate deep neural networks. To mitigate label noise, prior studies have proposed various robust loss functions, particularly symmetric losses. Nevertheless, symmetric losses usually suffer from the underfitting issue due to the overly strict constraint. To address this problem, the Active Passive Loss (APL) jointly optimizes an active and a passive loss to mutually enhance the overall fitting ability. Within APL, symmetric losses have been successfully extended, yielding advanced robust loss functions. Despite these advancements, emerging theoretical analyses indicate that asymmetric losses, a new class of robust loss functions, possess superior properties compared to symmetric losses. However, existing asymmetric losses are not compatible with advanced optimization frameworks such as APL, limiting their potential and applicability. Motivated by this theoretical gap and the prospect of asymmetric losses, we extend the asymmetric loss to the more complex passive loss scenario and propose the Asymetric Mean Square Error (AMSE), a novel asymmetric loss. We rigorously establish the necessary and sufficient condition under which AMSE satisfies the asymmetric condition. By substituting the traditional symmetric passive loss in APL with our proposed AMSE, we introduce a novel robust loss framework termed Joint Asymmetric Loss (JAL). Extensive experiments demonstrate the effectiveness of our method in mitigating label noise. Code available at: https://github.com/cswjl/joint-asymmetric-loss
Problem

Research questions and friction points this paper is trying to address.

Mitigating label noise in deep neural networks
Extending asymmetric losses to complex optimization frameworks
Proposing Joint Asymmetric Loss for robust learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends asymmetric loss to passive scenario
Proposes Asymmetric Mean Square Error (AMSE)
Introduces Joint Asymmetric Loss (JAL) framework
🔎 Similar Papers
No similar papers found.
Jialiang Wang
Jialiang Wang
Research Scientist, Meta AI
Computer VisionGenerative AI
X
Xianming Liu
Harbin Institute of Technology
Xiong Zhou
Xiong Zhou
Applied Scientist, Amazon
Computer visionmachine learning
G
Gangfeng Hu
Harbin Institute of Technology
D
Deming Zhai
Harbin Institute of Technology
Junjun Jiang
Junjun Jiang
Harbin Institute of Technology
Image ProcessingComputer VisionMachine Learning
X
Xiangyang Ji
Tsinghua University