Global Semantic-Sense Aggregation Network for Salient Object Detection in Remote Sensing Images

Authors:

Li Hongli 1,2, Chen Xuhui 1,2, Yang Wei 3, Huang Jian 3, Sun Kaimin 4, Wang Ying 3, Huang Andong 5, Mei Liye 5,6

Affiliation:

1. School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, China

2. Hubei Key Laboratory of Intelligent Robot, Wuhan Institute of Technology, Wuhan 430205, China

3. School of Information Science and Engineering, Wuchang Shouyi University, Wuhan 430064, China

4. State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China

5. School of Computer Science, Hubei University of Technology, Wuhan 430068, China

6. The Institute of Technological Sciences, Wuhan University, Wuhan 430072, China

Abstract

Salient object detection (SOD) aims to accurately identify significant geographical objects in remote sensing images (RSI), providing reliable support and guidance for extensive geographical information analyses and decisions. However, SOD in RSI faces numerous challenges, including shadow interference, inter-class feature confusion, and unclear target edge contours. Therefore, we designed an effective Global Semantic-Sense Aggregation Network (GSANet) to aggregate salient information in RSI. GSANet computes the information entropy of different regions and prioritizes areas with high information entropy as potential target regions, thereby achieving precise localization and semantic understanding of salient objects in remote sensing imagery. Specifically, we proposed a Semantic Detail Embedding Module (SDEM), which explores the potential connections among multi-level features and adaptively fuses shallow texture details with deep semantic features, efficiently aggregating the information entropy of salient regions and enhancing the information content of salient targets. Additionally, we proposed a Semantic Perception Fusion Module (SPFM) to analyze the mapping relationships between contextual information and local details, enhancing the perceptual capability for salient objects while suppressing irrelevant information entropy, thereby alleviating the semantic dilution of salient objects during up-sampling. Experimental results on two publicly available datasets, ORSSD and EORSSD, demonstrate the outstanding performance of our method, which achieved 93.91% Sα, 98.36% Eξ, and 89.37% Fβ on the EORSSD dataset.
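
To make the fusion idea in the abstract concrete, the following is a minimal PyTorch sketch of an SDEM-style step that adaptively blends shallow texture features with upsampled deep semantic features through learned channel gating. It is an illustrative assumption based only on the abstract's description; the class name AdaptiveDetailFusion, its layer choices, and all hyperparameters are hypothetical and are not taken from the paper or its released code.

```python
# Hypothetical sketch of adaptive shallow/deep feature fusion (not the authors' SDEM code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveDetailFusion(nn.Module):
    def __init__(self, shallow_ch, deep_ch, out_ch):
        super().__init__()
        self.align = nn.Conv2d(deep_ch, shallow_ch, kernel_size=1)  # match channel dims
        self.gate = nn.Sequential(                                   # per-channel fusion weights
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(shallow_ch, shallow_ch, kernel_size=1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(shallow_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, shallow_feat, deep_feat):
        # Upsample deep semantic features to the shallow (detail) resolution.
        deep_up = F.interpolate(self.align(deep_feat), size=shallow_feat.shape[-2:],
                                mode='bilinear', align_corners=False)
        # The gate decides, channel by channel, how much semantic context to inject.
        w = self.gate(deep_up)
        fused = shallow_feat * (1 - w) + deep_up * w
        return self.fuse(fused)

# Example: fuse a 1/4-resolution detail map with a 1/16-resolution semantic map.
if __name__ == "__main__":
    shallow = torch.randn(1, 64, 64, 64)
    deep = torch.randn(1, 256, 16, 16)
    print(AdaptiveDetailFusion(64, 256, 64)(shallow, deep).shape)  # torch.Size([1, 64, 64, 64])
```

In this sketch, the sigmoid gate plays the role of the adaptive weighting described for SDEM: channels where the deep semantics are informative receive more semantic context, while the remaining channels retain their shallow texture detail.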

Funder

Open Research Fund Program of LIESMARS

Hubei Key Laboratory of Intelligent Robot (Wuhan Institute of Technology) of China

Hubei Province Young Science and Technology Talent Morning Light Lift Project

Natural Science Foundation of Hubei Province

University Student Innovation and Entrepreneurship Training Program Project

Doctoral Starting Up Foundation of Hubei University of Technology

Publisher

MDPI AG
