🤖 AI Summary
Hyperspectral unmixing is severely hindered by scale-induced spectral variability arising from terrain topography, illumination conditions, and shadowing, which degrades estimation accuracy and slows convergence. This work presents the first systematic mathematical formulation of such variability and proposes a novel pre-processing framework grounded in geometric modeling and radiative transfer analysis. By formulating and solving an optimization problem, the method isolates and compensates for large-scale multiplicative distortions, thereby correcting the scale of each pixel's spectral response. The approach is generic and integrates seamlessly into diverse unmixing pipelines. Extensive experiments on two synthetic and two real hyperspectral datasets demonstrate that the proposed pre-processing reduces the abundance estimation errors of mainstream unmixing algorithms by approximately 50% on average, significantly enhancing unmixing accuracy, robustness, and convergence stability.
📝 Abstract
Spectral variability significantly impacts the accuracy and convergence of hyperspectral unmixing algorithms. While many methods address complex spectral variability, variations in the overall scale of spectral signatures caused by factors such as topography, illumination, and shadowing remain a major challenge. These variations often degrade unmixing performance and complicate model fitting. In this paper, we propose a novel preprocessing algorithm that corrects scale-induced spectral variability prior to unmixing. By isolating and compensating for these large-scale multiplicative effects, the algorithm provides a cleaner input, enabling unmixing methods to focus more effectively on modeling nonlinear spectral variability and on abundance estimation. We present a rigorous mathematical framework to describe scale variability, together with extensive experimental validation of the proposed algorithm. Furthermore, the algorithm's impact is evaluated across a broad spectrum of state-of-the-art unmixing algorithms on two synthetic and two real hyperspectral datasets. The proposed preprocessing step consistently improves the performance of these algorithms, including those specifically designed to handle spectral variability, with error reductions close to 50% in many cases. This demonstrates that scale correction acts as a complementary step, facilitating more accurate unmixing by existing methods. The algorithm's generality and significant impact highlight its potential as a key component in practical hyperspectral unmixing pipelines. The implementation code will be made publicly available upon publication.
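To make the core idea concrete: the abstract describes compensating for per-pixel multiplicative (scale) distortions before any unmixing is run. The paper's actual optimization is not reproduced here; the sketch below assumes the common scaled linear-mixing view, where an observed pixel is `y_p = s_p * x_p` with an unknown brightness factor `s_p > 0` (due to topography, illumination, or shadow), and uses a simple brightness-ratio estimator as an illustrative stand-in for the scale estimate.

```python
import numpy as np

def scale_correct(Y, eps=1e-12):
    """Illustrative multiplicative scale correction (not the paper's method).

    Y : (bands, pixels) hyperspectral data, assumed y_p = s_p * x_p
        with s_p > 0 a per-pixel brightness factor.
    Returns the corrected cube and the estimated scale factors.
    """
    # Crude estimate of s_p: each pixel's mean intensity relative to the
    # global mean intensity. Dividing it out flattens large-scale
    # brightness variation while preserving spectral shape.
    pixel_mean = Y.mean(axis=0)                      # per-pixel brightness
    s = pixel_mean / (pixel_mean.mean() + eps)       # relative scale estimate
    Y_corrected = Y / (s[None, :] + eps)             # undo multiplicative effect
    return Y_corrected, s
```

Under this model the corrected pixels agree with the undistorted ones up to a single global constant, which is harmless for abundance estimation since abundances depend only on spectral shape ratios, not absolute brightness.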