Affiliations:
1. College of Information Engineering, Sichuan Agricultural University, Ya’an 625000, China
2. Agricultural Information Engineering Higher Institution Key Laboratory of Sichuan Province, Ya’an 625000, China
3. Ya’an Digital Agricultural Engineering Technology Research Center, Ya’an 625000, China
Abstract
Monitoring ships on water surfaces encounters obstacles such as weather conditions, sunlight, and water ripples, posing significant challenges to accurately detecting target ships in real time. Synthetic Aperture Radar (SAR) offers a viable solution for real-time ship detection, unaffected by cloud coverage, precipitation, or light levels. However, SAR images are often degraded by speckle noise, salt-and-pepper noise, and water surface ripple interference. This study introduces LCAS-DetNet, a Multi-Location Cross-Attention Ship Detection Network tailored to ship detection in SAR images. Modeled on the YOLO architecture, LCAS-DetNet comprises a feature extractor, an intermediate layer (“Neck”), and a detection head. The feature extractor incorporates Multi-Location Cross-Attention (MLCA) for precise extraction of ship features at multiple scales. Comprising both local and global branches, MLCA strengthens the network’s ability to discern spatial arrangements and identify targets via a cross-attention mechanism. Each branch employs Multi-Location Attention (MLA) and computes pixel-level correlations in both the channel and spatial dimensions, further mitigating the impact of salt-and-pepper noise on the distribution of target ship pixels. The feature extractor interleaves downsampling with stacked MLCA blocks, enhanced with residual connections and Patch Embedding, to improve the network’s multi-scale spatial recognition capabilities. As the network deepens, this structure becomes cascaded and multi-scale, providing the network with a richer receptive field. Additionally, we introduce a loss function based on Wise-IoU v3 to address the influence of label quality on gradient updates. The effectiveness of our network was validated on the HRSID and SSDD datasets, where it achieved state-of-the-art performance: 96.59% precision on HRSID and 97.52% on SSDD.
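To illustrate the general idea of fusing a local and a global branch through cross-attention, the following is a minimal PyTorch-style sketch, not the authors’ MLCA implementation: the module name (LocalGlobalCrossAttention), the choice of depthwise and pointwise convolutions for the two branches, and all sizes are illustrative assumptions.

```python
# Hypothetical sketch of a two-branch cross-attention block (not the paper's exact MLCA).
import torch
import torch.nn as nn


class LocalGlobalCrossAttention(nn.Module):
    """Local and global branches exchange information via cross-attention.

    Queries from the local branch attend to keys/values from the global
    branch, so per-pixel spatial detail and scene-level context are fused
    before the result is passed on. Names and sizes are assumptions made
    for illustration only.
    """

    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        # Local branch: depthwise conv keeps fine-grained spatial detail.
        self.local = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels),
            nn.BatchNorm2d(channels),
            nn.GELU(),
        )
        # Global branch: pointwise conv summarizes channel-wise context.
        self.global_ = nn.Sequential(
            nn.Conv2d(channels, channels, 1),
            nn.BatchNorm2d(channels),
            nn.GELU(),
        )
        # Cross-attention: local features query the global features.
        self.cross_attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        loc = self.local(x).flatten(2).transpose(1, 2)     # (B, H*W, C)
        glo = self.global_(x).flatten(2).transpose(1, 2)   # (B, H*W, C)
        fused, _ = self.cross_attn(query=loc, key=glo, value=glo)
        fused = self.norm(fused + loc)                     # residual connection
        return fused.transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    # Example: a 64-channel feature map after one downsampling stage.
    block = LocalGlobalCrossAttention(channels=64)
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 64, 32, 32])
```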