Conformal Margin Risk Minimization: An Envelope Framework for Robust Learning under Label Noise

📅 2026-04-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes CMRM, a plug-and-play robust learning framework that operates without prior knowledge such as noise transition matrices, clean subsets, or pretrained features. It is the first to bring the quantile-calibration mechanism of conformal prediction into the label-noise setting: it models the confidence gap between the observed label and its competing labels to dynamically select high-confidence samples and suppress potentially mislabeled ones. CMRM constructs a batch-level, conformal-quantile-based regularizer that seamlessly strengthens the robustness of any classification loss. Extensive experiments show that CMRM improves average accuracy by up to 3.39% across six benchmarks with synthetic and real-world label noise, reduces conformal prediction set size by up to 20.44%, and incurs no performance degradation in noise-free settings.
📝 Abstract
Most methods for learning with noisy labels require privileged knowledge such as noise transition matrices, clean subsets or pretrained feature extractors, resources typically unavailable when robustness is most needed. We propose Conformal Margin Risk Minimization (CMRM), a plug-and-play envelope framework that improves any classification loss under label noise by adding a single quantile-calibrated regularization term, with no privileged knowledge or training pipeline modification. CMRM measures the confidence margin between the observed label and competing labels, and thresholds it with a conformal quantile estimated per batch to focus training on high-margin samples while suppressing likely mislabeled ones. We derive a learning bound for CMRM under arbitrary label noise requiring only mild regularity of the margin distribution. Across five base methods and six benchmarks with synthetic and real-world noise, CMRM consistently improves accuracy (up to +3.39%), reduces conformal prediction set size (up to -20.44%) and does not hurt under 0% noise, showing that CMRM captures a method-agnostic uncertainty signal that existing mechanisms did not exploit.
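The abstract describes the core mechanism: compute a confidence margin between the observed label and the best competing label, then threshold it with a conformal quantile estimated per batch to keep high-margin samples and down-weight likely mislabeled ones. A minimal sketch of that selection step is below; the exact margin score, quantile form, and `alpha` level are assumptions for illustration, not the paper's precise formulation.

```python
import numpy as np

def cmrm_batch_weights(probs, labels, alpha=0.25):
    """Sketch of a CMRM-style batch selection step (assumed form).

    probs:  (n, K) softmax outputs for one batch
    labels: (n,)   observed, possibly noisy, labels
    alpha:  miscoverage level for the conformal quantile

    Returns a 0/1 sample mask, the per-sample margins, and the
    batch quantile threshold tau.
    """
    n = probs.shape[0]
    # Margin: probability of the observed label minus the best competitor.
    p_obs = probs[np.arange(n), labels]
    competing = probs.copy()
    competing[np.arange(n), labels] = -np.inf
    margin = p_obs - competing.max(axis=1)

    # Conformal-style finite-sample quantile of the batch margins.
    level = np.ceil((n + 1) * alpha) / n
    tau = np.quantile(margin, min(level, 1.0))

    # Keep high-margin (likely clean) samples; suppress the rest.
    return (margin >= tau).astype(float), margin, tau

# Toy batch: the last sample's observed label disagrees with the model.
probs = np.array([[0.80, 0.10, 0.10],
                  [0.20, 0.70, 0.10],
                  [0.40, 0.35, 0.25],
                  [0.10, 0.20, 0.70]])
labels = np.array([0, 1, 0, 0])
mask, margin, tau = cmrm_batch_weights(probs, labels)
```

In the toy batch, the low-margin samples (including the one whose observed label the model strongly contradicts) fall below the batch quantile and receive zero weight, which is the suppression behavior the abstract describes.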
Problem

Research questions and friction points this paper is trying to address.

label noise
robust learning
conformal prediction
margin risk
classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Conformal Prediction
Label Noise
Margin-based Learning
Robust Learning
Quantile Calibration
Yuanjie Shi
School of EECS, Washington State University, Pullman, WA, USA
Peihong Li
School of EECS, Washington State University, Pullman, WA, USA
Zijian Zhang
School of EECS, Washington State University, Pullman, WA, USA
Janardhan Rao Doppa
School of EECS, Washington State University, Pullman, WA, USA
Yan Yan
Washington State University
Machine Learning
Computer Vision