Improving image quality of sparse-view lung tumor CT images with U-Net

Authors:

Annika Ries, Tina Dorosti, Johannes Thalhammer, Daniel Sasse, Andreas Sauter, Felix Meurer, Ashley Benne, Tobias Lasser, Franz Pfeiffer, Florian Schaff, Daniela Pfeiffer

Abstract

Background: We aimed to improve the image quality (IQ) of sparse-view computed tomography (CT) images using a U-Net for lung metastasis detection and to determine the best tradeoff between the number of views, IQ, and diagnostic confidence.

Methods: CT images from 41 subjects (aged 62.8 ± 10.6 years, mean ± standard deviation; 23 men; 34 with lung metastasis, 7 healthy) were retrospectively selected (2016–2018) and forward projected onto 2,048-view sinograms. Six corresponding sparse-view CT data subsets at varying levels of undersampling were reconstructed from the sinograms using filtered backprojection with 16, 32, 64, 128, 256, and 512 views. A dual-frame U-Net was trained and evaluated for each subsampling level on 8,658 images from 22 diseased subjects. For a single-blinded multireader study, one representative image per scan was selected from 19 subjects (12 diseased, 7 healthy). These slices, at all subsampling levels and with and without U-Net postprocessing, were presented to three readers, who ranked IQ and diagnostic confidence on predefined scales. Subjective nodule segmentation was evaluated using sensitivity and the Dice similarity coefficient (DSC); a clustered Wilcoxon signed-rank test was used.

Results: The 64-view sparse-view images yielded a sensitivity of 0.89 and a DSC of 0.81, while their U-Net-postprocessed counterparts improved on both metrics (sensitivity 0.94, DSC 0.85; p = 0.400). Fewer views led to IQ insufficient for diagnosis. At higher view counts, no substantial differences were noted between sparse-view and postprocessed images.

Conclusions: The number of projection views can be reduced from 2,048 to 64 while keeping IQ and the radiologists' confidence at a satisfactory level.

Relevance statement: Our reader study demonstrates the benefit of U-Net postprocessing in regular CT screening of patients with lung metastasis, increasing IQ and diagnostic confidence while reducing the dose.

Key points:
• Streak artifacts from sparse projection views reduce the quality and usability of sparse-view CT images.
• U-Net-based postprocessing removes sparse-view artifacts while maintaining diagnostically accurate IQ.
• Postprocessed sparse-view CT drastically increases radiologists' confidence in diagnosing lung metastasis.
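For readers who want to reproduce the basic setup, the sketch below simulates sparse-view CT along the lines the abstract describes: forward projection onto a many-view sinogram, subsampling of the views, and filtered backprojection (FBP), plus the DSC metric used in the reader study. It is a minimal illustration assuming scikit-image and a Shepp-Logan phantom as a stand-in for the clinical lung CT slices; it is not the authors' pipeline, all names in it are chosen for illustration, and the dual-frame U-Net postprocessing step is omitted.

```python
# Illustrative sketch (not the authors' code): simulate sparse-view CT by
# forward projecting a slice onto a 2,048-view sinogram, keeping 64 views,
# and reconstructing with filtered backprojection. The view counts follow
# the abstract; the phantom stands in for a clinical lung CT slice.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

image = shepp_logan_phantom()  # stand-in for a clinical CT slice

# Full-view sinogram: 2,048 equally spaced projection angles over 180 degrees.
full_angles = np.linspace(0.0, 180.0, 2048, endpoint=False)
sinogram = radon(image, theta=full_angles)

# Sparse-view subset: keep every 32nd view, i.e., 64 projections.
step = 2048 // 64
sparse_angles = full_angles[::step]
sparse_sinogram = sinogram[:, ::step]

# FBP with a ramp filter; streak artifacts appear at low view counts.
recon_sparse = iradon(sparse_sinogram, theta=sparse_angles, filter_name="ramp")

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * intersection / (mask_a.sum() + mask_b.sum())

# Example DSC usage on crude threshold masks (illustration only; the study
# compared reader-drawn nodule segmentations, not thresholded images).
dsc = dice(recon_sparse > 0.5, image > 0.5)
```

In the study, the U-Net was applied to reconstructions like recon_sparse to suppress the streak artifacts before reading; the DSC and sensitivity were then computed against the readers' segmentations on the full-view images.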

Funder

Deutsche Forschungsgemeinschaft

Institute for Advanced Study, Technische Universität München

Technische Universität München

Publisher

Springer Science and Business Media LLC

