🤖 AI Summary
Problem: Existing asymmetric loss functions are incompatible with advanced optimization frameworks such as the Active Passive Loss (APL), because asymmetry has not yet been extended to passive losses. Method: This paper proposes the Joint Asymmetric Loss (JAL) framework, centered on a novel Asymmetric Mean Squared Error (AMSE) loss that extends asymmetry to the passive loss setting for the first time, combined with APL's active–passive joint optimization. Contribution/Results: The authors rigorously establish the necessary and sufficient condition under which AMSE satisfies the asymmetric condition. Extensive experiments across diverse noise types and intensities show that JAL consistently outperforms symmetric loss baselines, achieving absolute accuracy gains of 2.1–5.7 percentage points on CIFAR-10, CIFAR-100, and WebVision, substantiating its robustness to noisy labels and improved generalization.
📝 Abstract
Learning with noisy labels is a crucial task for training accurate deep neural networks. To mitigate label noise, prior studies have proposed various robust loss functions, particularly symmetric losses. Nevertheless, symmetric losses usually suffer from underfitting due to their overly strict constraint. To address this problem, the Active Passive Loss (APL) jointly optimizes an active and a passive loss to mutually enhance the overall fitting ability. Within APL, symmetric losses have been successfully extended, yielding advanced robust loss functions. Despite these advancements, emerging theoretical analyses indicate that asymmetric losses, a new class of robust loss functions, possess superior properties compared to symmetric losses. However, existing asymmetric losses are not compatible with advanced optimization frameworks such as APL, limiting their potential and applicability. Motivated by this theoretical gap and the prospect of asymmetric losses, we extend the asymmetric loss to the more complex passive loss scenario and propose the Asymmetric Mean Squared Error (AMSE), a novel asymmetric loss. We rigorously establish the necessary and sufficient condition under which AMSE satisfies the asymmetric condition. By substituting the traditional symmetric passive loss in APL with our proposed AMSE, we introduce a novel robust loss framework termed Joint Asymmetric Loss (JAL). Extensive experiments demonstrate the effectiveness of our method in mitigating label noise. Code available at: https://github.com/cswjl/joint-asymmetric-loss
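To make the active–passive structure concrete, here is a minimal, self-contained sketch of an APL-style joint loss. It is an illustration under stated assumptions, not the paper's AMSE: the active term is ordinary cross-entropy on the labeled class, the passive term is a plain MSE against the one-hot target (the component JAL replaces with its asymmetric variant), and `alpha`/`beta` are hypothetical weighting hyperparameters.

```python
import math

def softmax(logits):
    # numerically stable softmax over a list of logits
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def active_ce(probs, label):
    # active term: cross-entropy, which only "pushes up" the labeled class
    return -math.log(max(probs[label], 1e-12))

def passive_mse(probs, label):
    # passive term: mean squared error against the one-hot target,
    # which also "pushes down" the probabilities of all other classes;
    # JAL swaps this symmetric MSE for its asymmetric AMSE
    k = len(probs)
    return sum((p - (1.0 if i == label else 0.0)) ** 2
               for i, p in enumerate(probs)) / k

def joint_loss(logits, label, alpha=1.0, beta=1.0):
    # APL-style combination: weighted sum of an active and a passive loss
    probs = softmax(logits)
    return alpha * active_ce(probs, label) + beta * passive_mse(probs, label)
```

As a sanity check, a confidently correct prediction should incur a much smaller joint loss than a confidently wrong one, e.g. `joint_loss([5.0, 0.0, 0.0], 0)` versus `joint_loss([0.0, 5.0, 0.0], 0)`.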