Affiliation:
1. School of Integrated Circuits, Laboratory for Optoelectronics, Huazhong University of Science and Technology, Wuhan, Hubei, China
2. Department of Radiation Oncology, Hubei Cancer Hospital, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, Hubei, China
Abstract
Introduction: The incidence of liver cancer is rising annually. Precise identification of liver tumors is crucial for clinicians to plan treatment and combat liver cancer. To date, liver tumor contours have largely been derived through labor-intensive and subjective manual labeling. Computers are now widely applied to liver tumor segmentation; nonetheless, the task remains challenging owing to the diverse range of tumor volumes, shapes, and image intensities encountered.
Methods: In this article, we introduce the attention connect network (AC-Net), an innovative solution for automated liver tumor segmentation. Building on the U-shaped network architecture, our approach incorporates two critical attention modules, the axial attention module (AAM) and the vision transformer module (VTM), which replace conventional skip-connections to integrate spatial features. The AAM facilitates feature fusion by computing axial attention across feature maps, while the VTM operates on the lowest-resolution feature maps, applying multihead self-attention and reshaping the output into a feature map for subsequent concatenation. We also employ a loss function tailored to our approach. Our methodology begins with pretraining AC-Net on the LiTS2017 dataset and subsequently fine-tuning it on computed tomography (CT) and magnetic resonance imaging (MRI) data sourced from Hubei Cancer Hospital.
Results: On CT data, AC-Net achieves a dice similarity coefficient (DSC) of 0.90, Jaccard coefficient (JC) of 0.82, recall of 0.92, average symmetric surface distance (ASSD) of 4.59, Hausdorff distance (HD) of 11.96, and precision of 0.89. On MRI data, it achieves a DSC of 0.80, JC of 0.70, recall of 0.82, ASSD of 7.58, HD of 30.26, and precision of 0.84.
Conclusion: Comparative experiments show that AC-Net achieves high tumor recognition accuracy on the Hubei Cancer Hospital dataset, demonstrating highly competitive performance for practical clinical applications. Ablation experiments further confirm the efficacy of each proposed module. The code for this research article is available at the following GitHub repository: https://github.com/killian-zero/py_tumor-segmentation.git.
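To make the two skip-connection replacements described above more concrete, the following PyTorch sketch outlines one plausible shape of the AAM and VTM. The module names follow the abstract, but the channel sizes, head counts, depth, and exact wiring are illustrative assumptions rather than the authors' implementation; consult the linked repository for the actual code.

```python
# Minimal sketch of the two attention modules named in the abstract.
# All hyperparameters and wiring here are assumptions for illustration.
import torch
import torch.nn as nn


class AxialAttentionModule(nn.Module):
    """Fuses spatial features by computing attention along the H and W axes."""

    def __init__(self, channels, heads=4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(channels, heads, batch_first=True)

    def forward(self, x):                      # x: (B, C, H, W)
        b, c, h, w = x.shape
        # Attention along the width axis (each row treated as a sequence).
        rows = x.permute(0, 2, 3, 1).reshape(b * h, w, c)
        rows, _ = self.row_attn(rows, rows, rows)
        x = rows.reshape(b, h, w, c).permute(0, 3, 1, 2)
        # Attention along the height axis (each column treated as a sequence).
        cols = x.permute(0, 3, 2, 1).reshape(b * w, h, c)
        cols, _ = self.col_attn(cols, cols, cols)
        return cols.reshape(b, w, h, c).permute(0, 3, 2, 1)


class VisionTransformerModule(nn.Module):
    """Applies multihead self-attention to the lowest-resolution feature map
    and reshapes the output back to (B, C, H, W) for concatenation."""

    def __init__(self, channels, heads=8, depth=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=channels, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):                      # x: (B, C, H, W)
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)  # (B, H*W, C) token sequence
        tokens = self.encoder(tokens)
        return tokens.transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    feat = torch.randn(1, 64, 32, 32)          # hypothetical encoder feature map
    print(AxialAttentionModule(64)(feat).shape)
    print(VisionTransformerModule(64)(feat).shape)
```

In this reading, both modules take an encoder feature map and return a tensor of the same shape, so they can stand in for a skip-connection and be concatenated with the corresponding decoder features, which is consistent with the role the abstract assigns to them.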
Funder
Health Commission of Hubei Province Scientific Research Project
National Natural Science Foundation of China
Shenzhen Basic Science Research
Natural Science Foundation of Hubei Province
Cited by
1 article.