A partial convolution generative adversarial network for lesion synthesis and enhanced liver tumor segmentation

Author:

Liu Yingao 1, Yang Fei 2, Yang Yidong 3,4

Affiliation:

1. Department of Engineering and Applied Physics, University of Science and Technology of China, Hefei, Anhui, China

2. Department of Radiation Oncology, University of Miami School of Medicine, Miami, Florida, USA

3. Department of Radiation Oncology, The First Affiliated Hospital of USTC, Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei, Anhui, China

4. School of Physical Sciences & the Ion Medical Research Institute, University of Science and Technology of China, Hefei, Anhui, China

Abstract

Lesion segmentation is critical for clinicians to accurately stage disease and determine treatment strategy. Deep-learning-based automatic segmentation can improve both segmentation efficiency and accuracy. However, training a robust deep learning segmentation model requires training examples with sufficient diversity in lesion location and size. This study aims to develop a deep learning framework that generates synthetic lesions of various locations and sizes, which can be added to the training dataset to enhance lesion segmentation performance. The lesion synthesis network is a modified generative adversarial network (GAN). Specifically, we introduce a partial convolution strategy to construct a U-Net-like generator. The discriminator is designed using a Wasserstein GAN with gradient penalty and spectral normalization. A mask generation method based on principal component analysis (PCA) was developed to model various lesion shapes. The generated masks are then converted into liver lesions through the lesion synthesis network. The lesion synthesis framework was evaluated for lesion texture, and the synthetic lesions were used to train a lesion segmentation network to further validate the effectiveness of the framework. All networks were trained and tested on the public LiTS dataset. Our experiments demonstrate that the synthetic lesions generated by our approach have GLCM-energy and GLCM-correlation distributions very similar to those of real lesions. Including the synthetic lesions in the training of the segmentation network improved the Dice score from 67.3% to 71.4%; meanwhile, precision improved from 74.6% to 76.0% and sensitivity from 66.1% to 70.9%. The proposed lesion synthesis approach outperforms the two existing approaches compared against. Including the synthetic lesion data in the training dataset significantly improves segmentation performance.
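The abstract's key building block is the partial convolution, which restricts the convolution to valid (unmasked) pixels and renormalizes by the fraction of valid pixels under each window, so synthesis inside the lesion mask is conditioned only on surrounding tissue. Below is a minimal single-channel sketch of that operation following the common mask-renormalization formulation; the function name, valid padding, and single-channel simplification are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def partial_conv2d(x, mask, kernel):
    """Single-channel partial convolution with valid padding.

    Convolves x with kernel using only pixels where mask == 1,
    rescaling each window's result by (window size / number of valid
    pixels), and returns the updated mask (1 wherever the window saw
    at least one valid pixel).
    """
    kh, kw = kernel.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    new_mask = np.zeros((oh, ow))
    n = kh * kw  # full window size, used for renormalization
    for i in range(oh):
        for j in range(ow):
            m = mask[i:i + kh, j:j + kw]
            valid = m.sum()
            if valid > 0:
                out[i, j] = (x[i:i + kh, j:j + kw] * m * kernel).sum() * (n / valid)
                new_mask[i, j] = 1.0
    return out, new_mask

# Constant image with a 2x2 hole: renormalization makes an averaging
# kernel reproduce the constant value despite the missing pixels.
x = np.ones((4, 4))
mask = np.ones((4, 4))
mask[1:3, 1:3] = 0  # hole (e.g., the lesion region to be synthesized)
out, new_mask = partial_conv2d(x, mask, np.ones((3, 3)) / 9.0)
```

The mask update is what lets successive partial-convolution layers in a U-Net-like generator progressively fill the hole from its boundary inward.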
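The segmentation gains are reported as Dice, precision, and sensitivity. For reference, a minimal Dice coefficient for binary masks (the standard definition, not the authors' evaluation code):

```python
import numpy as np

def dice_score(pred, gt):
    """Dice coefficient 2|P∩G| / (|P|+|G|) for two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * inter / denom if denom else 1.0  # both empty -> perfect
```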

Funder

Fundamental Research Funds for the Central Universities

Publisher

Wiley

Subject

Radiology, Nuclear Medicine and Imaging; Instrumentation; Radiation

Cited by 5 articles.

1. Light&fast generative adversarial network for high-fidelity CT image synthesis of liver tumor;Computer Methods and Programs in Biomedicine;2024-09

2. Liver tumor segmentation using G-Unet and the impact of preprocessing and postprocessing methods;Multimedia Tools and Applications;2024-03-07

3. Liver Tumor Segmentation using Hybrid Residual Network and Conditional Random Fields;2023 International Conference on the Confluence of Advancements in Robotics, Vision and Interdisciplinary Technology Management (IC-RVITM);2023-11-28

4. Medical Image Processing based on Generative Adversarial Networks: A Systematic Review;Current Medical Imaging Reviews;2023-10-23

5. Object Detection of Coal Mine Substation Switch Cabinets Based on YOLOv5n_Faster;Modeling and Simulation;2023
