SMARTIES: Spectrum-Aware Multi-Sensor Auto-Encoder for Remote Sensing Images

📅 2025-06-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing deep learning models for remote sensing are typically designed for fixed sensors, exhibiting poor generalization and limited cross-sensor transferability. To address this, the authors propose SMARTIES (Spectrum-Aware Multi-Sensor Auto-Encoder), a band-aware, multi-sensor self-supervised foundation model. SMARTIES employs a single unified Transformer backbone that projects data from heterogeneous sensors into a shared spectrum-aware space and is pretrained by reconstructing masked multi-sensor data with cross-sensor token mixup, enabling adaptation to arbitrary band combinations without architectural modification. By jointly modeling spectral characteristics and sensor-specific variations, SMARTIES learns sensor-agnostic representations end-to-end over heterogeneous remote sensing data. Extensive experiments show that SMARTIES outperforms sensor-specific pretrained models on both single-modal and multi-modal downstream tasks while improving cross-sensor transfer and deployment flexibility.

📝 Abstract
From optical sensors to microwave radars, leveraging the complementary strengths of remote sensing (RS) sensors is crucial for achieving dense spatio-temporal monitoring of our planet. However, recent deep learning models, whether task-specific or foundational, are often specific to single sensors or to fixed combinations: adapting such models to different sensory inputs requires both architectural changes and re-training, limiting scalability and generalization across multiple RS sensors. By contrast, a single model able to modulate its feature representations to accept diverse sensors as input would pave the way to agile and flexible multi-sensor RS data processing. To address this, we introduce SMARTIES, a generic and versatile foundation model that removes sensor-specific adaptation effort and enables scalability and generalization to diverse RS sensors: SMARTIES projects data from heterogeneous sensors into a shared spectrum-aware space, enabling the use of arbitrary combinations of bands both for training and inference. To obtain sensor-agnostic representations, we train a single, unified transformer model reconstructing masked multi-sensor data with cross-sensor token mixup. On both single- and multi-modal tasks across diverse sensors, SMARTIES outperforms previous models that rely on sensor-specific pretraining. Our code and pretrained models are available at https://gsumbul.github.io/SMARTIES.
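The abstract's central idea — projecting bands from arbitrary sensors into one shared spectrum-aware space — can be illustrated with a minimal sketch. This is not the paper's implementation: the sinusoidal wavelength embedding, the band wavelengths, and the shared weight matrix `W` are illustrative assumptions; the point is only that images with different numbers of bands end up with the same embedding shape.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16  # shared embedding dimension (illustrative)

def spectrum_embedding(wavelength_nm, dim=D):
    """Sinusoidal embedding of a band's central wavelength (assumed
    scheme), so spectrally similar bands map to nearby vectors."""
    freqs = np.exp(np.linspace(0.0, np.log(1000.0), dim // 2))
    x = wavelength_nm / freqs
    return np.concatenate([np.sin(x), np.cos(x)])

def project_bands(image, wavelengths_nm, W):
    """Project an (H, W, B) image with B arbitrary bands into a
    shared (H, W, D) space by weighting each band's pixels with
    its wavelength embedding, then summing over bands."""
    out = np.zeros(image.shape[:2] + (W.shape[1],))
    for b, wl in enumerate(wavelengths_nm):
        e = spectrum_embedding(wl)            # (D,)
        out += image[:, :, b:b+1] * (e @ W)   # broadcast over pixels
    return out

# Hypothetical band sets: a 3-band optical sensor vs. a 2-band one.
W = rng.normal(size=(D, D)) / np.sqrt(D)      # shared projection weights
rgb = project_bands(rng.random((4, 4, 3)), [665, 560, 490], W)
two = project_bands(rng.random((4, 4, 2)), [842, 665], W)
assert rgb.shape == two.shape == (4, 4, D)    # sensor-agnostic shape
```

Because the per-band contribution depends only on the band's wavelength, the same weights serve any band combination at both training and inference time, which is the property the abstract emphasizes.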
Problem

Research questions and friction points this paper is trying to address.

Deep learning models for RS are typically tied to a single sensor or a fixed band combination
Adapting such models to new sensory inputs requires architectural changes and retraining
Heterogeneous sensor data lack a shared representation space for joint processing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Projects data into shared spectrum-aware space
Uses cross-sensor token mixup for reconstruction
Unified transformer model for multi-sensor processing
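The pretraining recipe named above — masked reconstruction with cross-sensor token mixup — can be sketched as follows. This is a simplified assumption of the mechanism, not the authors' code: `cross_sensor_mixup` randomly swaps aligned patch tokens between two sensors viewing the same scene, and `random_mask` applies an MAE-style mask before the (omitted) encoder step.

```python
import numpy as np

rng = np.random.default_rng(1)

def cross_sensor_mixup(tokens_a, tokens_b, p=0.5, rng=rng):
    """Per token, pick the row from sensor A or sensor B at random,
    yielding a mixed sequence over one spatially aligned scene."""
    assert tokens_a.shape == tokens_b.shape
    take_b = rng.random(tokens_a.shape[0]) < p     # per-token choice
    return np.where(take_b[:, None], tokens_b, tokens_a), take_b

def random_mask(n_tokens, mask_ratio=0.75, rng=rng):
    """MAE-style random mask: True = hidden from the encoder."""
    idx = rng.permutation(n_tokens)
    mask = np.zeros(n_tokens, dtype=bool)
    mask[idx[: int(n_tokens * mask_ratio)]] = True
    return mask

optical = rng.random((64, 16))   # 64 patch tokens, optical sensor
radar = rng.random((64, 16))     # 64 aligned tokens, SAR sensor
mixed, src = cross_sensor_mixup(optical, radar)
mask = random_mask(len(mixed))
visible = mixed[~mask]           # encoder input: unmasked mixed tokens
assert visible.shape == (16, 16) # 25% of 64 tokens stay visible
```

Reconstructing the masked tokens from this mixed visible set forces the shared transformer to produce representations that are interchangeable across sensors, which is what makes the downstream features sensor-agnostic.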