Classification of multi‐feature fusion ultrasound images of breast tumor within category 4 using convolutional neural networks

Authors:

Xu Pengfei1, Zhao Jing2, Wan Mingxi1, Song Qing3, Su Qiang4, Wang Diya1

Affiliations:

1. Department of Biomedical Engineering, Key Laboratory of Biomedical Information Engineering of Ministry of Education, School of Life Science and Technology, Xi'an Jiaotong University, Xi'an, China

2. The Second Hospital of Jilin University, Changchun, China

3. The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, China

4. Department of Oncology, Beijing Friendship Hospital, Capital Medical University, Beijing, China

Abstract

Background: Breast tumors are a serious threat to women's health. Ultrasound (US) is a common and economical method for diagnosing breast cancer. Among the five Breast Imaging Reporting and Data System (BI-RADS) categories, category 4 has the highest false-positive rate, at about 30%. Classification within BI-RADS category 4 is challenging and has not been fully studied.

Purpose: This work aimed to use convolutional neural networks (CNNs) to classify B-mode US images of breast tumors within BI-RADS category 4, overcoming dependence on the operator and on artifacts. It also intended to take full advantage of the morphological and textural features in breast-tumor US images to improve classification accuracy.

Methods: First, original US images obtained directly from the hospital were cropped and resized. Of 1385 B-mode US BI-RADS category 4 images, biopsy confirmed 503 benign and 882 malignant tumors. K-means clustering images and sliding-window entropy images were then computed from the US images. Because the original B-mode images, K-means clustering images, and entropy images represent different characteristics of malignant and benign tumors, the three were fused channel-wise into a three-channel multi-feature fusion image dataset. The training, validation, and test sets contained 969, 277, and 139 images, respectively. Using transfer learning, 11 CNN models, including DenseNet and ResNet, were investigated. Finally, the better-performing models were selected by comparing accuracy, precision, recall, F1-score, and area under the curve (AUC). Normality of the data was assessed with the Shapiro-Wilk test. The DeLong test and the independent t-test were used to evaluate significant differences in AUC and the other metrics, and the false discovery rate was used to confirm the advantage of the CNN with the highest evaluation metrics. In addition, anti-log compression was studied, but it showed no improvement in CNN classification results.

Results: With multi-feature fusion images, DenseNet121 achieved the highest accuracy among the CNNs, 80.22 ± 1.45%, with precision of 77.97 ± 2.89% and AUC of 0.82 ± 0.01. Multi-feature fusion improved the accuracy of DenseNet121 by 1.87% over classification of the original B-mode images (p < 0.05).

Conclusion: CNNs with multi-feature fusion show good potential for reducing the false-positive rate within category 4. This work illustrates that CNNs and fusion images can reduce the false-positive rate for breast tumors within US BI-RADS category 4 and make the diagnosis of category 4 breast tumors more accurate and precise.
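The three-channel fusion described in the Methods can be sketched as follows. This is a minimal pure-Python illustration, not the authors' implementation: the paper does not specify the cluster count, window size, or resizing details here, so k = 3 clusters and a 3 x 3 entropy window are illustrative assumptions, and the toy K-means operates on scalar pixel intensities.

```python
# Hedged sketch of a three-channel "multi-feature fusion" input:
# channel 0 = original B-mode intensity, channel 1 = K-means cluster label,
# channel 2 = sliding-window Shannon entropy. Parameters are assumptions,
# not values taken from the paper.
import math
import random

def kmeans_1d(pixels, k=3, iters=10, seed=0):
    """Cluster scalar intensities with a toy K-means; return a label per pixel."""
    rng = random.Random(seed)
    centers = rng.sample(sorted(set(pixels)), k)   # k distinct initial centers
    labels = [0] * len(pixels)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: abs(p - centers[c]))
                  for p in pixels]
        for c in range(k):                          # update each center to its mean
            members = [p for p, l in zip(pixels, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels

def window_entropy(img, r=1):
    """Shannon entropy (bits) of the (2r+1)x(2r+1) neighborhood of each pixel."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[ii][jj]
                    for ii in range(max(0, i - r), min(h, i + r + 1))
                    for jj in range(max(0, j - r), min(w, j + r + 1))]
            n = len(vals)
            out[i][j] = -sum((vals.count(v) / n) * math.log2(vals.count(v) / n)
                             for v in set(vals))
    return out

def fuse(img, k=3):
    """Stack (B-mode, K-means label, entropy) into one 3-channel image."""
    h, w = len(img), len(img[0])
    labels = kmeans_1d([v for row in img for v in row], k=k)
    ent = window_entropy(img)
    return [[(img[i][j], labels[i * w + j], ent[i][j])
             for j in range(w)] for i in range(h)]
```

In practice the same pipeline would use library routines (e.g., an image-processing toolbox's K-means and local-entropy filters) and stack the three maps as the RGB planes of the CNN input.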

Funder

National Natural Science Foundation of China

Publisher

Wiley

Cited by 1 article.
