Noise-Adaptive Regularization for Robust Multi-Label Remote Sensing Image Classification

📅 2026-01-13
🤖 AI Summary
This work addresses the performance degradation in multi-label remote sensing image classification caused by additive, subtractive, and mixed label noise introduced through automated or crowdsourced annotation. To mitigate this issue, the paper proposes a Noise-Adaptive Regularization (NAR) method that, within a semi-supervised learning framework, explicitly distinguishes among multiple types of label noise for the first time. NAR employs a confidence-driven dynamic label refinement mechanism that selectively retains, suspends, or flips labels, integrated with Early Learning Regularization (ELR) to stabilize training. Extensive experiments demonstrate that NAR consistently outperforms existing approaches across various noise settings, achieving particularly significant improvements under subtractive and mixed noise conditions, thereby substantially enhancing the robustness of multi-label classification in remote sensing applications.
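The summary mentions that NAR integrates Early Learning Regularization (ELR) to stabilize training. As a rough illustration (not the paper's implementation), ELR maintains an exponential moving average of past predictions and penalizes divergence from it; the per-label adaptation, the hyperparameters `beta` and `lam`, and the function name below are all assumptions for the sketch:

```python
import numpy as np

def elr_regularizer(probs, targets_ema, beta=0.7, lam=3.0):
    """Sketch of an Early-Learning-Regularization-style penalty,
    adapted per label for a multi-label setting (an assumption; the
    paper's exact integration with NAR is not given in this summary).

    probs:       current model probabilities per label.
    targets_ema: running average of past predictions (the "early-learning"
                 target), returned in updated form.
    """
    probs = np.asarray(probs, dtype=float)
    # update the moving-average target toward the current predictions
    targets_ema = beta * targets_ema + (1.0 - beta) * probs
    # per-label agreement between prediction and EMA target
    inner = targets_ema * probs + (1.0 - targets_ema) * (1.0 - probs)
    inner = np.clip(inner, 1e-8, 1.0 - 1e-8)
    # minimizing log(1 - agreement) pulls predictions toward the EMA
    # target, resisting late-stage memorization of corrupted labels
    penalty = lam * np.mean(np.log(1.0 - inner))
    return penalty, targets_ema
```

The penalty is added to the classification loss; because `log(1 - inner)` decreases as agreement grows, minimizing it keeps predictions anchored to what the model learned early in training.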

📝 Abstract
The development of reliable methods for multi-label classification (MLC) has become a prominent research direction in remote sensing (RS). As the scale of RS data continues to expand, annotation procedures increasingly rely on thematic products or crowdsourcing to reduce the cost of manual annotation. While cost-effective, these strategies often introduce multi-label noise in the form of partially incorrect annotations. In MLC, label noise arises as additive noise, subtractive noise, or a combination of both in the form of mixed noise. Previous work has largely overlooked this distinction and commonly treats noisy annotations as supervised signals, lacking mechanisms that explicitly adapt learning behavior to different noise types. To address this limitation, we propose NAR, a noise-adaptive regularization method that explicitly distinguishes between additive and subtractive noise within a semi-supervised learning framework. NAR employs a confidence-based label handling mechanism that dynamically retains label entries with high confidence, temporarily deactivates entries with moderate confidence, and corrects low-confidence entries via flipping. This selective attenuation of supervision is integrated with early-learning regularization (ELR) to stabilize training and mitigate overfitting to corrupted labels. Experiments across additive, subtractive, and mixed noise scenarios demonstrate that NAR consistently improves robustness compared with existing methods. Performance improvements are most pronounced under subtractive and mixed noise, indicating that adaptive suppression and selective correction of noisy supervision provide an effective strategy for noise-robust learning in RS MLC.
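The abstract's three-way label handling (retain / deactivate / flip) can be sketched as follows. This is a minimal illustration, not the paper's method: the confidence measure (agreement between the observed label and the model's probability) and the thresholds `tau_hi` and `tau_lo` are assumptions, since the abstract does not specify them:

```python
import numpy as np

def confidence_based_label_handling(labels, probs, tau_hi=0.7, tau_lo=0.3):
    """Sketch of NAR-style confidence-based label handling
    (thresholds and confidence definition are illustrative).

    For each label entry:
      - high confidence     -> retain the observed label,
      - moderate confidence -> temporarily deactivate (mask out),
      - low confidence      -> correct via flipping.
    Returns (refined_labels, mask), where mask == 0 marks entries
    excluded from the supervised loss.
    """
    labels = np.asarray(labels, dtype=float)  # observed multi-hot labels
    probs = np.asarray(probs, dtype=float)    # model probabilities per label
    # confidence that the observed entry is correct:
    # p if the label is 1, (1 - p) if the label is 0
    agreement = np.where(labels == 1, probs, 1.0 - probs)

    refined = labels.copy()
    mask = np.ones_like(labels)
    flip = agreement < tau_lo                       # low confidence: flip
    refined[flip] = 1.0 - labels[flip]
    suspend = (agreement >= tau_lo) & (agreement < tau_hi)
    mask[suspend] = 0.0                             # moderate: deactivate
    return refined, mask
```

In a training loop, `mask` would weight the per-label loss so suspended entries contribute no gradient, while flipped entries supply corrected supervision.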
Problem

Research questions and friction points this paper is trying to address.

multi-label classification
label noise
remote sensing
additive noise
subtractive noise
Innovation

Methods, ideas, or system contributions that make the work stand out.

noise-adaptive regularization
multi-label classification
label noise
confidence-based label handling
remote sensing
Tom Burgert
Technische Universität Berlin, Berlin Institute for the Foundations of Learning and Data
Deep Learning, Remote Sensing
Julia Henkel
Berlin Institute for the Foundations of Learning and Data (BIFOLD) and Faculty of Electrical Engineering and Computer Science, Technische Universität Berlin, 10623 Berlin, Germany
Begum Demir
Berlin Institute for the Foundations of Learning and Data (BIFOLD) and Faculty of Electrical Engineering and Computer Science, Technische Universität Berlin, 10623 Berlin, Germany