Affiliation:
1. Department of Computer Engineering, Faculty of Technology, Selcuk University, Konya, Turkey
Abstract
The classification of medical images enables physicians to perform rapid and accurate data analysis, increasing the chances of timely disease diagnosis and early intervention for the patient. However, classification is a time-consuming and labour-intensive process when performed manually. The Capsule Network (CapsNet) architecture offers advantages for the accurate and fast classification of medical images owing to its ability to evaluate images in terms of part-whole relationships, its robustness to rotations and affine transformations, and its good performance on small datasets. However, CapsNet may perform poorly on complex datasets. In this study, a new CapsNet model named MResCaps is proposed to overcome this disadvantage and enhance performance on complex images. MResCaps employs an increasing number of residual blocks in each layer of a parallel-laned architecture to obtain rich feature maps at different levels, aiming to achieve high accuracy in the classification of a variety of medical images. To evaluate the model's performance, the CIFAR10 dataset and the DermaMNIST, PneumoniaMNIST, and OrganMNIST-S datasets from the MedMNIST collection are used. MResCaps outperformed CapsNet by 20% in terms of accuracy on the CIFAR10 dataset. In addition, AUC values of 96.25%, 96.30%, and 97.12% were achieved on the DermaMNIST, PneumoniaMNIST, and OrganMNIST-S datasets, respectively. The results show that the proposed MResCaps model improves the performance of CapsNet in the classification of complex and medical images. Furthermore, the model outperforms existing studies in the literature. This study contributes to the literature by introducing a novel perspective on CapsNet-based architectures for medical image classification through a parallel-laned architecture and a rich-feature-capsule-focused approach.
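The abstract describes the core idea only at a high level: parallel lanes holding an increasing number of residual blocks, whose multi-level feature maps feed capsule layers. The sketch below is a minimal, hypothetical PyTorch reconstruction of that idea, not the authors' implementation: the module names, channel sizes, capsule dimension, and the plain linear readout (standing in for CapsNet's dynamic routing, which is omitted for brevity) are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    # CapsNet nonlinearity: capsule vector length encodes activation probability.
    norm2 = (s * s).sum(dim=dim, keepdim=True)
    return (norm2 / (1.0 + norm2)) * s / torch.sqrt(norm2 + eps)

class ResidualBlock(nn.Module):
    # Standard two-conv residual block with an identity skip connection.
    def __init__(self, ch):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(ch)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(ch)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)

class MResCapsSketch(nn.Module):
    # Hypothetical layout: lane i contains i+1 residual blocks, so the lanes
    # produce feature maps at increasing depths; the concatenated maps are
    # regrouped into primary capsule vectors.
    def __init__(self, in_ch=3, lane_ch=32, num_lanes=3, caps_dim=8, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(in_ch, lane_ch, 3, padding=1)
        self.lanes = nn.ModuleList([
            nn.Sequential(*[ResidualBlock(lane_ch) for _ in range(i + 1)])
            for i in range(num_lanes)
        ])
        fused = lane_ch * num_lanes
        assert fused % caps_dim == 0
        self.primary = nn.Conv2d(fused, fused, 3, stride=2, padding=1)
        self.caps_dim = caps_dim
        # Assumption: a linear head replaces routing-based class capsules.
        self.head = nn.LazyLinear(num_classes)

    def forward(self, x):
        x = F.relu(self.stem(x))
        fused = torch.cat([lane(x) for lane in self.lanes], dim=1)
        p = self.primary(fused)                    # (B, fused, H/2, W/2)
        caps = p.view(p.size(0), -1, self.caps_dim)  # group channels into capsules
        caps = squash(caps)                        # capsule nonlinearity
        return self.head(caps.flatten(1))          # class logits

if __name__ == "__main__":
    model = MResCapsSketch()
    logits = model(torch.randn(2, 3, 32, 32))      # CIFAR10-sized input
    print(logits.shape)                            # torch.Size([2, 10])
```

The parallel lanes give each capsule access to both shallow and deep features of the same input, which is one plausible reading of the "rich feature maps at different levels" claim; the paper itself should be consulted for the exact lane depths, fusion scheme, and routing procedure.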
Funder
Türkiye Bilimsel ve Teknolojik Araştırma Kurumu (TÜBİTAK, the Scientific and Technological Research Council of Türkiye)