🤖 AI Summary
To address the downlink pressure and processing latency caused by the storage, computational, and bandwidth constraints of satellite-borne SAR systems, this paper proposes a lightweight CNN model designed specifically for unfocused SAR data. It achieves, for the first time, real-time onboard ship detection and ship/wind-turbine binary classification in both Stripmap and Interferometric Wide (IW) modes. The model operates directly on raw unfocused SAR data, eliminating the computational overhead of conventional imaging preprocessing, and its FPGA-based hardware-software co-optimization enables low-power, high-throughput embedded deployment. Experimental results demonstrate inference latency below 100 ms, power consumption below 5 W, a 92% reduction in downlinked data volume, and a detection accuracy of 94.3% (mAP@0.5). This work establishes a practical, end-to-end intelligent SAR processing paradigm tailored to resource-constrained satellite platforms.
📝 Abstract
Synthetic Aperture Radar (SAR) data enables large-scale surveillance of maritime vessels. However, near-real-time monitoring is currently constrained by the need to downlink all raw data, perform image focusing, and subsequently analyze it on the ground. On-board processing to generate higher-level products could reduce the data volume that needs to be downlinked, alleviating bandwidth constraints and minimizing latency. However, traditional image focusing and processing algorithms face challenges due to the satellite's limited memory, processing power, and computational resources. This work proposes and evaluates neural networks designed for real-time inference on unfocused SAR data acquired by Sentinel-1 in Stripmap and Interferometric Wide (IW) modes. Our results demonstrate the feasibility of deploying one of our models for on-board processing on an FPGA. Additionally, by investigating a binary classification task between ships and windmills, we demonstrate that target classification is possible.
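To make the core idea concrete, the sketch below shows a minimal forward pass of a tiny CNN operating directly on a raw, unfocused SAR patch represented as two real channels (I and Q), ending in ship-vs-windmill logits. This is an illustrative toy in plain NumPy, not the paper's actual architecture: the patch size, channel layout, layer widths, and random weights are all assumptions for demonstration.

```python
import numpy as np

def conv2d(x, w):
    """Valid cross-correlation. x: (C, H, W), w: (K, C, kh, kw) -> (K, H', W')."""
    K, C, kh, kw = w.shape
    _, H, W = x.shape
    out = np.zeros((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(x[:, i:i + kh, j:j + kw] * w[k])
    return out

rng = np.random.default_rng(0)

# Hypothetical 16x16 unfocused SAR patch, stored as 2 channels (I and Q).
patch = rng.standard_normal((2, 16, 16))

# Tiny network: one conv layer -> ReLU -> global average pool -> 2-class linear head.
w_conv = rng.standard_normal((4, 2, 3, 3)) * 0.1   # 4 filters over the I/Q channels
feat = np.maximum(conv2d(patch, w_conv), 0)        # (4, 14, 14)
pooled = feat.mean(axis=(1, 2))                    # (4,) global average pooling
w_fc = rng.standard_normal((2, 4)) * 0.1           # logits: [ship, windmill]
logits = w_fc @ pooled
probs = np.exp(logits) / np.exp(logits).sum()      # softmax over the two classes
```

Operating on I/Q channels like this avoids the range-Doppler or chirp-scaling focusing step entirely, which is what removes the imaging preprocessing cost on the satellite; an FPGA deployment would additionally quantize the weights and pipeline the convolutions.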