Spiking Meets Attention: Efficient Remote Sensing Image Super-Resolution with Attention Spiking Neural Networks

📅 2025-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the weak representational capacity of spiking neural networks (SNNs) and their limited exploration in remote sensing image super-resolution (RS-SR), this paper introduces an attention mechanism into the SNN framework for the first time, proposing the Spiking Attention Block (SAB). The SAB dynamically modulates membrane potentials and spike generation via joint spatiotemporal modulation, and integrates global self-similarity priors to generate spatial attention weights, enhancing reconstruction fidelity and structural realism for large-scale remote sensing imagery. The method combines channel-temporal coupled attention with event-driven feature optimization. Evaluated on benchmark datasets including AID, DOTA, and DIOR, it achieves state-of-the-art (SOTA) performance in both quantitative metrics and visual quality, while significantly reducing energy consumption and computational overhead, enabling joint optimization of accuracy and energy efficiency.

📝 Abstract
Spiking neural networks (SNNs) are emerging as a promising alternative to traditional artificial neural networks (ANNs), offering biological plausibility and energy efficiency. Despite these merits, SNNs are frequently hampered by limited capacity and insufficient representation power, and remain underexplored in remote sensing image (RSI) super-resolution (SR) tasks. In this paper, we first observe that spiking signals exhibit drastic intensity variations across diverse textures, highlighting an active learning state of the neurons. This observation motivates us to apply SNNs for efficient SR of RSIs. Inspired by the success of attention mechanisms in representing salient information, we devise the spiking attention block (SAB), a concise yet effective component that optimizes membrane potentials through inferred attention weights, which, in turn, regulate spiking activity for superior feature representation. Our key contributions include: 1) we bridge the independent modulation between temporal and channel dimensions, facilitating joint feature correlation learning, and 2) we exploit the global self-similar patterns in large-scale remote sensing imagery to infer spatial attention weights, incorporating effective priors for realistic and faithful reconstruction. Building upon SAB, we propose SpikeSR, which achieves state-of-the-art performance across remote sensing benchmarks such as AID, DOTA, and DIOR, while maintaining high computational efficiency. The code of SpikeSR will be available upon paper acceptance.
Problem

Research questions and friction points this paper is trying to address.

Enhance remote sensing image super-resolution using spiking neural networks.
Address limited capacity and representation power in SNNs for SR tasks.
Develop SpikeSR with spiking attention block for efficient, realistic reconstruction.
Innovation

Methods, ideas, or system contributions that make the work stand out.

The Spiking Attention Block optimizes membrane potentials via inferred attention weights.
Joint feature correlation learning through coupled temporal-channel modulation.
Global self-similar patterns are used to infer spatial attention weights.
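The core idea of modulating membrane potentials with attention weights can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the sigmoid channel gate stands in for the paper's learned attention, and the leaky integrate-and-fire `decay` and `threshold` values are arbitrary.

```python
import numpy as np

def channel_attention(x):
    # Squeeze: average over spatial dims to get a per-channel descriptor,
    # then gate with a sigmoid (a stand-in for the learned attention module).
    s = x.mean(axis=(1, 2))           # shape (C,)
    return 1.0 / (1.0 + np.exp(-s))   # per-channel weights in (0, 1)

def spiking_attention_step(u, x, decay=0.5, threshold=1.0):
    """One leaky integrate-and-fire step with attention-scaled input.

    u: (C, H, W) membrane potential; x: (C, H, W) input current.
    Returns the updated potential and the binary spike map.
    """
    w = channel_attention(x)[:, None, None]  # broadcast weights over H, W
    u = decay * u + w * x                    # attention modulates the drive
    spikes = (u >= threshold).astype(x.dtype)
    u = u * (1.0 - spikes)                   # hard reset where a spike fired
    return u, spikes

# Usage: unroll the update over a few time steps.
rng = np.random.default_rng(0)
u = np.zeros((4, 8, 8))
for t in range(4):
    u, spikes = spiking_attention_step(u, rng.uniform(0.0, 2.0, size=(4, 8, 8)))
```

Because the gate multiplies the input current before integration, channels with stronger average activation charge their membranes faster and spike more often, which is the qualitative behavior the SAB description implies.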