MTAN: A semi-supervised learning model for kidney tumor segmentation

Authors:

Sun Peng 1, Yang Sijing 2, Guan Haolin 1, Mo Taiping 1, Yu Bonan 3, Chen Zhencheng 1,2,4,5

Affiliations:

1. School of Electronic Engineering and Automation, Guilin University of Electronic Technology, Guilin, Guangxi, China

2. School of Life and Environmental Science, Guilin University of Electronic Technology, Guilin, Guangxi, China

3. School of Architecture and Transportation Engineering, Guilin University of Electronic Technology, Guilin, Guangxi, China

4. Guangxi Colleges and Universities Key Laboratory of Biomedical Sensors and Intelligent Instruments, Guilin, Guangxi, China

5. Guangxi Engineering Technology Research Center of Human Physiological Information Noninvasive Detection, Guilin, Guangxi, China

Abstract

BACKGROUND: Medical image segmentation is crucial in disease diagnosis and treatment planning, and deep learning (DL) techniques have shown promise for this task. However, optimizing DL models requires tuning numerous parameters and demands substantial labeled datasets, which are labor-intensive to create.

OBJECTIVE: This study proposes a semi-supervised model that can use both labeled and unlabeled data to accurately segment kidneys, tumors, and cysts on CT images, even with limited labeled samples.

METHODS: An end-to-end semi-supervised learning model named MTAN (Mean Teacher Attention N-Net) is designed to segment kidneys, tumors, and cysts on CT images. MTAN is built on the AN-Net architecture, which serves both as the student and as the teacher. In its student role, AN-Net learns conventionally; in its teacher role, it generates targets and guides the student model in using them to improve learning quality. This semi-supervised design allows MTAN to make effective use of unlabeled data during training, improving performance and reducing overfitting.

RESULTS: We evaluate the proposed model on two CT image datasets (KiTS19 and KiTS21). On KiTS19, MTAN achieves average Dice scores of 0.975 for kidneys and 0.869 for tumors. On KiTS21, MTAN demonstrates its robustness, yielding average Dice scores of 0.977 for kidneys, 0.886 for masses, 0.861 for tumors, and 0.759 for cysts.

CONCLUSION: The proposed MTAN model offers a compelling solution for accurate medical image segmentation, particularly when labeled data is scarce. By effectively exploiting unlabeled data through a semi-supervised learning approach, MTAN mitigates overfitting and achieves high-quality segmentation results. Its consistent performance across two distinct datasets, KiTS19 and KiTS21, underscores the model's reliability and its potential as a clinical reference.
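The METHODS paragraph follows the general Mean Teacher paradigm: the student network learns from labeled data in the usual way, while the teacher produces targets on unlabeled data that the student is trained to match. The sketch below illustrates one training step under that standard formulation only; the paper's AN-Net backbone, attention mechanism, and loss weighting are not described on this page, so the placeholder network (TinySegNet), the helper names (ema_update, train_step), and all hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
# Minimal Mean Teacher training sketch (hypothetical; a toy encoder-decoder
# stands in for AN-Net, and noise/augmentation is omitted for brevity).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySegNet(nn.Module):
    """Placeholder segmentation network for 1-channel CT slices."""
    def __init__(self, n_classes=3):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, n_classes, 1)
    def forward(self, x):
        return self.head(self.enc(x))

student = TinySegNet()
teacher = copy.deepcopy(student)          # teacher starts as a copy of the student
for p in teacher.parameters():
    p.requires_grad_(False)               # teacher is updated by EMA, not by gradients

opt = torch.optim.Adam(student.parameters(), lr=1e-3)

def ema_update(teacher, student, alpha=0.99):
    """Exponential moving average of student weights -> teacher weights."""
    with torch.no_grad():
        for t, s in zip(teacher.parameters(), student.parameters()):
            t.mul_(alpha).add_(s, alpha=1 - alpha)

def train_step(x_lab, y_lab, x_unlab, consistency_weight=1.0):
    # Supervised loss on the labeled batch.
    sup_loss = F.cross_entropy(student(x_lab), y_lab)
    # Consistency loss: the student's predictions on unlabeled images should
    # match the teacher's soft targets.
    with torch.no_grad():
        target = F.softmax(teacher(x_unlab), dim=1)
    cons_loss = F.mse_loss(F.softmax(student(x_unlab), dim=1), target)
    loss = sup_loss + consistency_weight * cons_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    ema_update(teacher, student)
    return loss.item()

# Random tensors standing in for CT slices (labeled and unlabeled) and masks.
x_lab   = torch.randn(2, 1, 64, 64)
y_lab   = torch.randint(0, 3, (2, 64, 64))
x_unlab = torch.randn(2, 1, 64, 64)
print(train_step(x_lab, y_lab, x_unlab))
```

In this generic setup the teacher never receives gradients; it trails the student as a smoothed average, which is what allows the unlabeled consistency targets to stay stable and reduce overfitting, as the abstract notes.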

Publisher

IOS Press

Subject

Electrical and Electronic Engineering, Condensed Matter Physics, Radiology, Nuclear Medicine and Imaging, Instrumentation, Radiation

