Towards Reliable Healthcare Imaging: Conditional Contrastive Generative Adversarial Network for Handling Class Imbalance in MR Images

Author:

Cui Lijuan1, Li Dengao1, Yang Xiaofeng2, Liu Chao3

Affiliation:

1. College of Computer Science and Technology (College of Data Science), Taiyuan University of Technology, Taiyuan, Shanxi, China

2. Department of Urology, First Hospital of Shanxi Medical University, Taiyuan, Shanxi, China

3. School of First Hospital of Shanxi Medical University, Taiyuan, Shanxi, China

Abstract

Background: Medical imaging datasets frequently suffer from class imbalance: the majority of pixels correspond to healthy regions, while only a minority belong to affected regions. This skewed pixel distribution complicates computer-aided diagnosis, because networks trained on imbalanced data tend to be biased toward the majority classes, often showing high precision but low sensitivity.

Method: We designed a new network based on adversarial learning, the conditional contrastive generative adversarial network (CCGAN), to tackle class imbalance in highly imbalanced MRI datasets. The proposed model has three new components: (1) a class-specific attention mechanism, (2) a region rebalancing module (RRM), and (3) a supervised contrastive-based learning network (SCoLN). The class-specific attention focuses on the more discriminative areas of the input representation, capturing the most relevant features. The RRM promotes a more balanced distribution of features across the regions of the input representation, ensuring a more equitable segmentation process. The generator of the CCGAN learns pixel-level segmentation by receiving feedback from the SCoLN based on true-negative and true-positive maps. This process ensures that the final semantic segmentation not only addresses the imbalanced-data issue but also improves classification accuracy.

Results: The proposed model achieved state-of-the-art performance on five highly imbalanced medical image segmentation datasets, and therefore holds significant potential for medical diagnosis in settings characterized by highly imbalanced data distributions. The CCGAN achieved the highest dice similarity coefficient (DSC) across datasets: 0.965 ± 0.012 on BUS2017, 0.896 ± 0.091 on DDTI, 0.786 ± 0.046 on LiTS MICCAI 2017, 0.712 ± 1.5 on ATLAS, and 0.877 ± 1.2 on BRATS 2015. DeepLab-V3 followed closely in second place, with DSC scores of 0.948 ± 0.010 on BUS2017, 0.895 ± 0.014 on DDTI, 0.763 ± 0.044 on LiTS MICCAI 2017, 0.696 ± 1.1 on ATLAS, and 0.846 ± 1.4 on BRATS 2015.
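The dice similarity coefficient reported above is the standard overlap measure for segmentation masks, DSC = 2|A ∩ B| / (|A| + |B|). A minimal NumPy sketch of how such a score can be computed for binary masks (the function name and the epsilon guard are illustrative choices, not taken from the paper):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary masks.

    DSC = 2|A ∩ B| / (|A| + |B|); eps guards against division
    by zero when both masks are empty.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy 4x4 example: the prediction covers 3 of the 4 lesion pixels,
# so DSC = 2*3 / (3 + 4) = 6/7 ≈ 0.857.
pred = np.array([[0, 0, 0, 0],
                 [0, 1, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 0]])
target = np.array([[0, 0, 0, 0],
                   [0, 1, 1, 0],
                   [0, 1, 1, 0],
                   [0, 0, 0, 0]])
print(round(dice_coefficient(pred, target), 3))  # → 0.857
```

Because DSC weights the overlap of the (usually small) foreground class directly, it is far more informative than pixel accuracy on the highly imbalanced datasets evaluated here, where a trivial all-background prediction can still reach very high accuracy.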

Funder

Central Government Guided Local Science and Technology Development Fund Project

National Natural Science Foundation of China

National Major Scientific Research Instrument Development Project of China

Key Research and Development Projects of Shanxi Province

The Central Guidance on Local Science and Technology Development Fund of Shanxi Province

Publisher

PeerJ

