Multi-Guidance CNNs for Salient Object Detection

Authors:

Hui Shuaixiong¹, Guo Qiang², Geng Xiaoyu², Zhang Caiming³

Affiliations:

1. School of Computer Science and Technology, Shandong University of Finance and Economics, and Shandong Provincial Key Laboratory of Digital Media Technology, East Erhuan Road, Jinan, China

2. School of Computer Science and Technology, Shandong University of Finance and Economics, and Shandong Provincial Key Laboratory of Digital Media Technology, East Erhuan Road, Jinan, China

3. School of Software, Shandong University, Shunhua Road, Jinan, China

Abstract

Feature refinement and feature fusion are two key steps in convolutional neural network–based salient object detection (SOD). In this article, we investigate how to utilize multiple guidance mechanisms to better refine and fuse extracted multi-level features, and propose a novel multi-guidance SOD model dubbed MGuid-Net. Since boundary information is beneficial for locating and sharpening salient objects, edge features are utilized in our network together with saliency features for SOD. Specifically, a self-guidance module is applied to multi-level saliency features and edge features, respectively, which aims to gradually guide the refinement of lower-level features by higher-level features. After that, a cross-guidance module is devised to mutually refine saliency features and edge features via the complementarity between them. Moreover, to better integrate refined multi-level features, we also present an accumulative guidance module, which exploits multiple high-level features to guide the fusion of different features in a hierarchical manner. Finally, a pixelwise contrast loss function is adopted as an implicit guidance to help our network retain more details in salient objects. Extensive experiments on five benchmark datasets demonstrate that our model can identify salient regions of an image more effectively than most state-of-the-art models.
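The cross-guidance idea in the abstract, where saliency and edge features mutually refine each other, can be illustrated with a minimal PyTorch sketch. This is a hypothetical reconstruction for intuition only: the class name `CrossGuidance`, the 1×1 gating convolutions, and the residual gating scheme are assumptions, not the authors' actual MGuid-Net implementation.

```python
# Hypothetical sketch of a cross-guidance step: each branch produces a
# single-channel attention map that gates the other branch, so edge cues
# sharpen saliency features and saliency cues suppress non-salient edges.
# Names and design details are illustrative, not from the paper.
import torch
import torch.nn as nn


class CrossGuidance(nn.Module):
    """Mutually refine saliency and edge features of the same resolution."""

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 convs squeeze each branch to a one-channel attention map
        self.sal_gate = nn.Conv2d(channels, 1, kernel_size=1)
        self.edge_gate = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, sal_feat: torch.Tensor, edge_feat: torch.Tensor):
        edge_attn = torch.sigmoid(self.edge_gate(edge_feat))
        sal_attn = torch.sigmoid(self.sal_gate(sal_feat))
        # Residual connections keep the original signal alongside the
        # cross-gated one, so guidance modulates rather than replaces.
        sal_refined = sal_feat + sal_feat * edge_attn
        edge_refined = edge_feat + edge_feat * sal_attn
        return sal_refined, edge_refined


# Toy usage: one level of backbone features at 1/4 resolution of a 224 input
sal = torch.randn(1, 64, 56, 56)
edge = torch.randn(1, 64, 56, 56)
sal_out, edge_out = CrossGuidance(64)(sal, edge)
print(sal_out.shape)  # torch.Size([1, 64, 56, 56])
```

In the full model this block would be applied at every feature level after the self-guidance refinement, before the accumulative fusion stage.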

Funder

National Natural Science Foundation of China

Natural Science Foundation of Shandong Province for Excellent Young Scholars

Science and Technology Innovation Program for Distinguished Young Scholars of Shandong Province Higher Education Institutions

Publisher

Association for Computing Machinery (ACM)

Subject

Computer Networks and Communications, Hardware and Architecture

References: 60 articles.


Cited by 12 articles.

1. DASOD: Detail-aware salient object detection;Image and Vision Computing;2024-08

2. Rethinking Feature Mining for Light Field Salient Object Detection;ACM Transactions on Multimedia Computing, Communications, and Applications;2024-07-08

3. Gated multi-modal edge refinement network for light field salient object detection;ACM Transactions on Multimedia Computing, Communications, and Applications;2024-06-28

4. SNIPPET: A Framework for Subjective Evaluation of Visual Explanations Applied to DeepFake Detection;ACM Transactions on Multimedia Computing, Communications, and Applications;2024-06-13

5. Progressive Adapting and Pruning: Domain-Incremental Learning for Saliency Prediction;ACM Transactions on Multimedia Computing, Communications, and Applications;2024-06-13
