Affiliations:
1. School of Computing, University of Derby, Derby DE22 1GB, UK
2. School of Computing, The University of Buckingham, Buckingham MK18 1EG, UK
Abstract
In recent years, deep convolutional neural networks (DCNNs) have shown promising performance in medical image analysis, including breast lesion classification in 2D ultrasound (US) images. Despite the outstanding performance of DCNN solutions, explaining their decisions remains an open problem, yet the explainability of DCNN models has become essential for healthcare systems to accept and trust them. This paper presents a novel framework for explaining DCNN classification decisions of lesions in ultrasound images using saliency maps that link the DCNN decisions to known cancer characteristics in the medical domain. The proposed framework consists of three main phases. First, DCNN models for classifying lesions in ultrasound images are built. Next, selected visualization methods are applied to obtain saliency maps on the input images of the DCNN models. In the final phase, the visualization outputs are mapped to the cancer characteristics known in the medical domain. The paper then demonstrates the use of the framework for breast lesion classification from ultrasound images. We first follow the transfer learning approach and build two DCNN models. We then analyze the visualization outputs of the trained DCNN models using the EGrad-CAM and Ablation-CAM methods. Through the visualization outputs, we map the DCNN model decisions for benign and malignant lesions to characteristics such as echogenicity, calcification, shape, and margin. A retrospective dataset of 1298 US images collected from different hospitals is used to evaluate the effectiveness of the framework. The test results show that these characteristics contribute differently to the decisions on benign and malignant lesions. Our study provides a foundation for other researchers to explain DCNN classification decisions for other cancer types.
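The second phase of the framework relies on CAM-style saliency maps computed from a transfer-learned classifier. As an illustration of that step only, the sketch below computes a plain Grad-CAM map for a two-class (benign vs. malignant) model; it is a minimal, hypothetical example, not the paper's implementation, which uses EGrad-CAM and Ablation-CAM. The ResNet-50 backbone, the hooked layer (`model.layer4[-1]`), and the helper name `grad_cam` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Transfer-learning step (illustrative): ImageNet-pretrained backbone with the
# classifier head replaced for two classes (benign vs. malignant).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, 2)
model.eval()

activations, gradients = {}, {}

def save_activation(module, inp, out):
    activations["value"] = out.detach()

def save_gradient(module, grad_in, grad_out):
    gradients["value"] = grad_out[0].detach()

# Hook the last convolutional block of the backbone (assumed target layer).
target_layer = model.layer4[-1]
target_layer.register_forward_hook(save_activation)
target_layer.register_full_backward_hook(save_gradient)

def grad_cam(image, class_idx):
    """Return a saliency map (H x W, values in [0, 1]) for `class_idx`.

    `image` is assumed to be a preprocessed US image tensor of shape
    (1, 3, H, W), e.g. a grayscale frame repeated across three channels.
    """
    logits = model(image)
    model.zero_grad()
    logits[0, class_idx].backward()

    acts = activations["value"]                        # (1, C, h, w)
    grads = gradients["value"]                         # (1, C, h, w)
    weights = grads.mean(dim=(2, 3), keepdim=True)     # channel-wise pooled gradients
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear",
                        align_corners=False)
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    return cam[0, 0]
```

In the final, mapping phase, a map such as `grad_cam(us_image, class_idx=1)` would be overlaid on the ultrasound image and its highlighted regions compared against annotated lesion characteristics (echogenicity, calcification, shape, margin).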