A Novel Hybridoma Cell Segmentation Method Based on Multi-Scale Feature Fusion and Dual Attention Network

Authors:

Lu Jianfeng 1, Ren Hangpeng 1, Shi Mengtao 1, Cui Chen 1,2, Zhang Shanqing 1, Emam Mahmoud 1,3, Li Li 1

Affiliation:

1. School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China

2. Key Laboratory of Public Security Information Application Based on Big-Data Architecture, Ministry of Public Security, Zhejiang Police College, Hangzhou 310000, China

3. Faculty of Artificial Intelligence, Menoufia University, Shebin El-Koom 32511, Egypt

Abstract

During the production of monoclonal antibody drugs, hybridoma cell screening is usually performed manually by visual inspection. This traditional screening method has certain limitations, such as low efficiency and subjectivity bias. Furthermore, most existing deep learning-based image segmentation methods have drawbacks when applied to this task, due to the varied shapes of hybridoma cells and their uneven spatial distribution. In this paper, we propose a deep hybridoma cell image segmentation method based on a residual and attention U-Net (RA-UNet). Firstly, the feature maps of the five modules in the network encoder are used for multi-scale feature fusion in a feature pyramid form and then concatenated into the network decoder to enrich the semantic level of the decoder feature maps. Secondly, a dual attention module based on global and channel attention mechanisms is presented. The global attention mechanism (a non-local neural network) is connected to the network decoder to expand the receptive field of the feature map and bring richer contextual information to the network. Then, the channel attention mechanism SENet (the squeeze-and-excitation network) is connected to the non-local attention mechanism. Consequently, important features are enhanced by learning per-channel weights while secondary features are suppressed, improving cell segmentation performance and accuracy. Finally, the focal loss function is used to guide the network to learn the hard-to-classify cell categories. We evaluate the performance of the proposed RA-UNet method on a newly established hybridoma cell image dataset. Experimental results show that the proposed method has good reliability and improves the efficiency of hybridoma cell segmentation compared with state-of-the-art networks such as FCN, UNet, and UNet++.
The proposed RA-UNet model achieves a dice coefficient, PA, MPA, and MIoU of 0.8937, 0.9926, 0.9512, and 0.9007, respectively.
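The abstract names two concrete components: SENet-style channel attention and the focal loss. The following is a minimal NumPy sketch of both, assuming a (C, H, W) feature map and illustrative settings of gamma = 2 and alpha = 0.25; the paper's actual layer shapes and hyperparameters are not stated here.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def se_channel_attention(x, w1, w2):
    """Squeeze-and-excitation (SENet-style) channel reweighting.

    x  : feature map of shape (C, H, W)
    w1 : (C // r, C) squeeze FC weights (r = reduction ratio, illustrative)
    w2 : (C, C // r) excitation FC weights
    """
    z = x.mean(axis=(1, 2))                  # squeeze: global average pooling -> (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0))  # excitation: FC -> ReLU -> FC -> sigmoid
    return x * s[:, None, None]              # scale each channel by its learned weight

def binary_focal_loss(p, y, gamma=2.0, alpha=0.25, eps=1e-7):
    """Binary focal loss FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t).

    p : predicted foreground probabilities, y : 0/1 ground-truth mask.
    Easy pixels (p_t near 1) are down-weighted by (1 - p_t)^gamma, so
    training focuses on the hard-to-classify cell pixels.
    """
    p = np.clip(p, eps, 1.0 - eps)
    p_t = np.where(y == 1, p, 1.0 - p)
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))
```

In the paper's architecture the SE block follows the non-local attention in the decoder; the sketch above only illustrates the channel-reweighting step itself.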

Funder

Public Welfare Technology Research Project of Zhejiang Province

Opening Project of the Key Laboratory of Public Security Information Application Based on Big-Data Architecture, Ministry of Public Security of Zhejiang Police College

National Natural Science Foundation of China

Publisher

MDPI AG

Subject

Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering


Cited by 9 articles.
