Prostate cancer segmentation from MRI by a multistream fusion encoder

Authors:

Jiang Mingjie (1), Yuan Baohua (1,2), Kou Weixuan (1), Yan Wen (1,3), Marshall Harry (4), Yang Qianye (3), Syer Tom (5), Punwani Shonit (5), Emberton Mark (6), Barratt Dean C. (3), Cho Carmen C. M. (7), Hu Yipeng (3), Chiu Bernard (1)

Affiliations:

1. Department of Electrical Engineering, City University of Hong Kong, Hong Kong SAR, China

2. Aliyun School of Big Data, Changzhou University, Changzhou, China

3. Centre for Medical Image Computing, Wellcome/EPSRC Centre for Interventional & Surgical Sciences, Department of Medical Physics & Biomedical Engineering, University College London, London, UK

4. Schulich School of Medicine & Dentistry, Western University, Ontario, Canada

5. Centre for Medical Imaging, University College London, London, UK

6. Division of Surgery & Interventional Science, University College London, London, UK

7. Prince of Wales Hospital and Department of Imaging and Interventional Radiology, Chinese University of Hong Kong, Hong Kong SAR, China

Abstract

Background: Targeted prostate biopsy guided by multiparametric magnetic resonance imaging (mpMRI) detects more clinically significant lesions than conventional systematic biopsy. Lesion segmentation is required for planning MRI-targeted biopsies. The need to integrate image features available in T2-weighted and diffusion-weighted images poses a challenge in prostate lesion segmentation from mpMRI.

Purpose: A flexible and efficient multistream fusion encoder is proposed in this work to facilitate the multiscale fusion of features from multiple imaging streams. A patch-based loss function is introduced to improve accuracy in segmenting small lesions.

Methods: The proposed multistream encoder fuses features extracted from the three imaging streams at each layer of the network, thereby allowing improved feature maps to propagate downstream and benefit segmentation performance. The fusion is achieved through a spatial attention map generated by optimally weighting the contribution of the convolution outputs from each stream. This design gives the network the flexibility to highlight image modalities according to their relative influence on segmentation performance. The encoder also performs multiscale integration by highlighting the input feature maps (low-level features) with the spatial attention maps generated from the convolution outputs (high-level features). The Dice similarity coefficient (DSC), when used as a cost function, is relatively insensitive to incorrect segmentation of small lesions. We address this issue by introducing a patch-based loss function that averages the DSCs obtained from local image patches. This local average DSC is equally sensitive to large and small lesions, as the patch-based DSCs associated with small and large lesions carry equal weight in the average.

Results: The framework was evaluated on 931 sets of images acquired in several clinical studies at two centers in Hong Kong and the United Kingdom. The training, validation, and test sets contained 615, 144, and 172 sets of images, respectively. The proposed framework outperformed single-stream networks and three recently proposed multistream networks, attaining F1 scores of 82.2% and 87.6% at the lesion and patient levels, respectively. The average inference time for an axial image was 11.8 ms.

Conclusion: The accuracy and efficiency afforded by the proposed framework would accelerate the MRI interpretation workflow of MRI-targeted biopsy and focal therapies.
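The abstract describes the fusion mechanism only at a high level: per-stream convolution outputs are weighted into a shared spatial attention map, which then gates the layer's input feature maps. Below is a minimal PyTorch sketch of one such fusion layer, not the paper's actual architecture; the stream count, the softmax weighting, the 1x1-convolution attention head, and all names are illustrative assumptions.

```python
# Minimal sketch of a multistream fusion layer, assuming three streams
# (e.g., T2w, DWI, ADC) with equal channel counts. Details not given in
# the abstract (weighting scheme, attention head) are assumptions.
import torch
import torch.nn as nn


class MultiStreamFusion(nn.Module):
    """Fuses per-stream convolution outputs into one spatial attention map."""

    def __init__(self, channels: int, n_streams: int = 3):
        super().__init__()
        # One convolution block per imaging stream (high-level features).
        self.stream_convs = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for _ in range(n_streams)
        )
        # Learnable per-stream weights; softmax makes them a convex combination,
        # letting the network emphasize modalities by their influence.
        self.stream_weights = nn.Parameter(torch.zeros(n_streams))
        # 1x1 conv collapses channels into a single-channel spatial map.
        self.to_attention = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, inputs: list[torch.Tensor]) -> list[torch.Tensor]:
        feats = [conv(x) for conv, x in zip(self.stream_convs, inputs)]
        w = torch.softmax(self.stream_weights, dim=0)
        fused = sum(wi * fi for wi, fi in zip(w, feats))
        # Spatial attention map in [0, 1], shared across all streams.
        attn = torch.sigmoid(self.to_attention(fused))
        # Multiscale integration: the attention map (from high-level features)
        # highlights the layer's input feature maps (low-level features),
        # which then propagate downstream.
        return [x * attn for x in inputs]
```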
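The patch-based loss can likewise be sketched from the abstract's description: compute a DSC per local patch and average, so a small lesion confined to one patch counts the same as a large one spanning many. The patch size, smoothing term, and the restriction of the average to lesion-containing patches are assumptions, not values from the paper.

```python
# Minimal sketch of a patch-based Dice loss, assuming non-overlapping
# 32x32 patches and an average restricted to patches containing lesion.
import torch
import torch.nn.functional as F


def patch_dice_loss(pred: torch.Tensor,
                    target: torch.Tensor,
                    patch: int = 32,
                    eps: float = 1e-6) -> torch.Tensor:
    """pred, target: (B, 1, H, W), pred in [0, 1]. Returns 1 - mean patch DSC."""
    # Tile both maps into non-overlapping patch x patch blocks:
    # output shape (B, patch*patch, L) with L patches per image.
    p = F.unfold(pred, kernel_size=patch, stride=patch)
    t = F.unfold(target, kernel_size=patch, stride=patch)
    inter = (p * t).sum(dim=1)
    denom = p.sum(dim=1) + t.sum(dim=1)
    dsc = (2 * inter + eps) / (denom + eps)  # per-patch DSC, shape (B, L)
    # Average only over patches containing lesion voxels, so empty
    # background patches do not dominate (an assumed detail).
    mask = t.sum(dim=1) > 0
    if mask.any():
        return 1.0 - dsc[mask].mean()
    return 1.0 - dsc.mean()
```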

Funder

Innovation and Technology Commission

Research Grants Council, University Grants Committee

Publisher

Wiley

Subject

General Medicine

Cited by 3 articles.