🤖 AI Summary
To address the fragmented detection of multi-scale change objects and the weak noise robustness of existing semi-supervised remote sensing change detection methods, this paper proposes HSACNet, a Hierarchical Scale-Aware Consistency Regularization framework. Methodologically, it introduces: (1) a Scale-Aware Differential Attention Module (SADAM) that explicitly models intra-layer multi-scale difference features; (2) the SAM2 Hiera encoder combined with lightweight, parameter-efficient adapters for efficient cross-layer multi-scale representation fusion; and (3) dual-augmentation consistency regularization to improve pseudo-label quality and noise robustness. Evaluated on four mainstream benchmarks, the method achieves state-of-the-art performance while significantly reducing model parameters and computational cost. Ablation studies and cross-dataset experiments further validate its effectiveness and practicality under limited annotation budgets.
📝 Abstract
Semi-supervised change detection (SSCD) aims to detect changes between bi-temporal remote sensing images by utilizing limited labeled data and abundant unlabeled data. Existing methods struggle in complex scenarios, exhibiting poor performance when confronted with noisy data. They typically emphasize inter-layer fusion while neglecting intra-layer multi-scale features, harming the integrity of change objects at different scales. In this paper, we propose HSACNet, a Hierarchical Scale-Aware Consistency regularized Network for SSCD. Specifically, we integrate Segment Anything Model 2 (SAM2), using its Hiera backbone as the encoder to extract inter-layer multi-scale features and applying adapters for parameter-efficient fine-tuning. Moreover, we design a Scale-Aware Differential Attention Module (SADAM) that can precisely capture intra-layer multi-scale change features and suppress noise. Additionally, a dual-augmentation consistency regularization strategy is adopted to effectively utilize the unlabeled data. Extensive experiments on four CD benchmarks demonstrate that our HSACNet achieves state-of-the-art performance with reduced parameters and computational cost.
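To make the dual-augmentation idea concrete, below is a minimal sketch of the general mechanism behind this family of consistency regularization: predictions on a weakly augmented view produce pseudo-labels that supervise a strongly augmented view, with unconfident pixels masked out. This is an illustrative sketch, not the paper's exact formulation; the confidence threshold `tau`, the hard-labeling scheme, and the per-pixel probability representation are assumptions for illustration.

```python
import math

def consistency_loss(p_weak, p_strong, tau=0.95):
    """Illustrative dual-augmentation consistency loss (assumed form).

    p_weak  : per-pixel change probabilities from the weakly augmented view
    p_strong: per-pixel change probabilities from the strongly augmented view
    tau     : confidence threshold; pixels whose weak-view prediction is
              unconfident are excluded, which filters noisy pseudo-labels.
    Returns the mean binary cross-entropy between hard pseudo-labels
    (thresholded weak-view predictions) and the strong-view predictions.
    """
    eps = 1e-7
    total, count = 0.0, 0
    for pw, ps in zip(p_weak, p_strong):
        confidence = max(pw, 1.0 - pw)      # how sure the weak view is
        if confidence < tau:                # mask out unconfident pixels
            continue
        y = 1.0 if pw >= 0.5 else 0.0       # hard pseudo-label
        total += -(y * math.log(ps + eps) + (1.0 - y) * math.log(1.0 - ps + eps))
        count += 1
    return total / max(count, 1)            # average over supervised pixels

# Example: two confident pixels are supervised, the third (0.6) is masked.
loss = consistency_loss([0.99, 0.01, 0.6], [0.9, 0.1, 0.5])
```

In a full training loop, this unlabeled-data term would be added to the supervised loss on the labeled subset; the confidence mask is what gives such schemes their robustness to noisy pseudo-labels.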