Abstract
The unique imaging mechanism of synthetic aperture radar (SAR) causes aircraft targets to appear as discrete scattering centers in images, so detection performance is easily degraded by complex surrounding ground objects. Although existing deep learning detectors perform well, they generally rely on a feature pyramid neck and a large backbone network, which reduces detection efficiency to some extent. To address these problems, this paper proposes a simple and efficient attention network (SEAN) that takes YOLOv5s as its baseline. First, the backbone is made shallower and a structural re-parameterization technique is introduced to strengthen its feature extraction capability. Second, the neck is built from a residual dilated module (RDM), a low-level semantic enhancement module (LSEM), and a localization attention module (LAM), substantially reducing the network's parameters and computation. On the Gaofen-3 aircraft target dataset, the method achieves 97.7% AP at 83.3 FPS on a Tesla M60, exceeding YOLOv5s by 1.3% AP and 8.7 FPS while using 40.51% of the parameters and 86.25% of the FLOPs.
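The abstract does not detail the re-parameterized block used in SEAN's backbone, so the following is only a minimal sketch of the general technique (a RepVGG-style block with hypothetical 3×3, 1×1, and identity branches): multiple branches are trained jointly and then folded into a single 3×3 convolution for inference, which keeps the extra training-time capacity without adding inference cost.

```python
# Minimal sketch of structural re-parameterization (RepVGG-style).
# The branch layout here is illustrative, not the paper's exact design.
import torch
import torch.nn as nn

class RepConvBlock(nn.Module):
    """Multi-branch block during training, a single 3x3 conv after fusion."""
    def __init__(self, channels):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn3 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn_id = nn.BatchNorm2d(channels)   # identity branch (in == out, stride 1)
        self.fused = None                       # single conv used after fusion

    def forward(self, x):
        if self.fused is not None:
            return torch.relu(self.fused(x))
        return torch.relu(self.bn3(self.conv3(x))
                          + self.bn1(self.conv1(x))
                          + self.bn_id(x))

    @staticmethod
    def _fuse_conv_bn(weight, bn):
        # Fold BatchNorm statistics into the preceding convolution weights.
        std = (bn.running_var + bn.eps).sqrt()
        w = weight * (bn.weight / std).reshape(-1, 1, 1, 1)
        b = bn.bias - bn.running_mean * bn.weight / std
        return w, b

    def reparameterize(self):
        c = self.conv3.out_channels
        w3, b3 = self._fuse_conv_bn(self.conv3.weight, self.bn3)
        w1, b1 = self._fuse_conv_bn(self.conv1.weight, self.bn1)
        w1 = nn.functional.pad(w1, [1, 1, 1, 1])           # lift 1x1 kernel to 3x3
        wid = torch.zeros(c, c, 3, 3)
        wid[torch.arange(c), torch.arange(c), 1, 1] = 1.0  # identity as a 3x3 kernel
        wid, bid = self._fuse_conv_bn(wid, self.bn_id)
        self.fused = nn.Conv2d(c, c, 3, padding=1)
        self.fused.weight.data = w3 + w1 + wid
        self.fused.bias.data = b3 + b1 + bid

# Quick check: fused inference matches the multi-branch forward pass.
block = RepConvBlock(16).eval()
x = torch.randn(1, 16, 32, 32)
y_train = block(x)
block.reparameterize()
y_fused = block(x)
print(torch.allclose(y_train, y_fused, atol=1e-5))
```

After `reparameterize()`, the block costs exactly one 3×3 convolution per layer at inference, which is consistent with the abstract's goal of raising backbone capacity without sacrificing detection speed.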
Funder
the Central University Basic Scientific Research Project of China
Subject
General Earth and Planetary Sciences