Computed tomography and guidelines-based human–machine fusion model for predicting resectability of pancreatic cancer

Author:

Yimamu Adilijiang¹, Li Jun², Zhang Haojie¹, Liang Lidu², Feng Lei¹, Wang Yi¹, Zhou Chenjie¹, Li Shulong², Gao Yi¹

Affiliation:

1. General Surgery Center, Department of Hepatobiliary Surgery II, Guangdong Provincial Research Center for Artificial Organ and Tissue Engineering, Guangzhou Clinical Research and Transformation Center for Artificial Liver, Institute of Regenerative Medicine, Zhujiang Hospital, Southern Medical University, Guangzhou, China

2. School of Biomedical Engineering, Guangdong Provincial Key Laboratory of Medical Image Processing, Guangdong Province Engineering Laboratory for Medical Imaging and Diagnostic Technology, Southern Medical University, Guangzhou, China

Abstract

Background and Aim: This study aimed to develop a hybrid machine learning (ML) model for predicting the resectability of pancreatic cancer, based on computed tomography (CT) and the National Comprehensive Cancer Network (NCCN) guidelines.

Methods: We retrospectively studied 349 patients. A total of 171 cases from Center 1 and 92 cases from Center 2 were used as the primary training cohort, and 66 cases from Center 3 and 20 cases from Center 4 were used as the independent test set. The semi-automatic segmentation module of ITK-SNAP was used to assist CT image segmentation and obtain the three-dimensional (3D) imaging region of interest (ROI). For each 3D ROI, 788 handcrafted features were extracted with PyRadiomics. The optimal feature subset, consisting of three features screened by three feature selection methods, served as the input to a support vector machine (SVM) to construct the conventional radiomics-based predictive model (cRad). The resolution of each 3D ROI was unified by 3D spline interpolation to construct a 3D tumor imaging tensor. Using the 3D tumor imaging tensor as input, a 3D kernelled support tensor machine-based predictive model (KSTM) and a 3D ResNet-based deep learning predictive model (ResNet) were constructed. A multi-classifier fusion ML model was then built by fusing cRad, KSTM, and ResNet with a multi-classifier fusion strategy. Two experts with more than 10 years of clinical experience were invited to re-evaluate each patient's contrast-enhanced CT (CECT) following the NCCN guidelines, yielding resectable, unresectable, and borderline resectable diagnoses. Following the traditional empirical method, these three readings were converted into probability values of 0.25, 0.75, and 0.50, respectively. The expert assessment was then treated as an independent classifier and integrated with the multi-classifier fusion ML model to obtain the human–machine fusion ML model (HMfML).

Results: The multi-classifier fusion ML model's area under the receiver operating characteristic curve (AUC: 0.8610), predictive accuracy (ACC: 80.23%), sensitivity (SEN: 78.95%), and specificity (SPE: 80.60%) were better than those of the cRad-, KSTM-, and ResNet-based single-classifier models and their two-classifier fusion models. This indicates that the three models mined complementary CECT feature expressions from different perspectives and could be integrated through CFS-ER, giving the fusion model better performance. HMfML achieved an AUC of 0.8845, an ACC of 82.56%, a SEN of 84.21%, and a SPE of 82.09%, suggesting that ML models can learn information from CECT that experts cannot distinguish, thereby complementing expert experience and improving the performance of the hybrid ML model.

Conclusion: HMfML can predict pancreatic cancer resectability with high accuracy.
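A minimal sketch of the cRad pipeline described above: handcrafted feature extraction with PyRadiomics followed by an SVM on a small selected feature subset. The file paths, the three selected feature names, the synthetic training data, and the default SVM settings are illustrative assumptions; the abstract does not specify the paper's actual feature subset or hyperparameters.

```python
import numpy as np
from radiomics import featureextractor
from sklearn.svm import SVC

# Default extractor enables shape, first-order, and texture feature classes.
extractor = featureextractor.RadiomicsFeatureExtractor()

def extract_case(image_path, mask_path):
    """Return the numeric handcrafted features for one segmented 3D ROI."""
    result = extractor.execute(image_path, mask_path)
    # Drop the diagnostics_* metadata entries, keep feature values only.
    return {k: float(v) for k, v in result.items()
            if not k.startswith("diagnostics")}

# Hypothetical three-feature subset standing in for the one screened in the paper.
SELECTED = ["original_shape_Sphericity",
            "original_firstorder_Entropy",
            "original_glcm_Contrast"]

def to_vector(features):
    return np.array([features[name] for name in SELECTED])

# Synthetic stand-in data: 20 cases x 3 selected features, binary labels
# (0 = resectable, 1 = unresectable).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(20, 3))
y_train = rng.integers(0, 2, size=20)
clf = SVC(probability=True).fit(X_train, y_train)  # probability=True enables predict_proba
p_crad = clf.predict_proba(X_train[:2])[:, 1]      # cRad probability of unresectability
```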
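The KSTM and ResNet branches both consume a fixed-size 3D tumor tensor obtained by spline interpolation. The sketch below assumes a 64x64x64 target grid and scipy's cubic-spline resampling; the abstract states only that 3D spline interpolation was used, not the target resolution.

```python
import numpy as np
from scipy.ndimage import zoom

def to_tensor(roi: np.ndarray, target=(64, 64, 64)) -> np.ndarray:
    """Resample a 3D ROI of arbitrary shape onto a fixed grid with cubic splines."""
    factors = [t / s for t, s in zip(target, roi.shape)]
    return zoom(roi, zoom=factors, order=3)  # order=3 -> cubic spline interpolation

roi = np.random.rand(37, 52, 44)   # stand-in for one segmented tumor ROI
tensor = to_tensor(roi)
assert tensor.shape == (64, 64, 64)
```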
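Finally, a sketch of the human–machine fusion step. The mapping of expert readings (resectable / borderline / unresectable) to probabilities of 0.25 / 0.50 / 0.75 comes from the abstract; the CFS-ER fusion rule itself is not described there, so the uniform weighted average below is a placeholder for it, not the paper's method.

```python
import numpy as np

# Expert-to-probability mapping stated in the abstract (probability of unresectability).
EXPERT_PROB = {"resectable": 0.25, "borderline": 0.50, "unresectable": 0.75}

def fuse(p_crad, p_kstm, p_resnet, expert_call, weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine the three ML probabilities with the expert-derived probability."""
    probs = np.array([p_crad, p_kstm, p_resnet, EXPERT_PROB[expert_call]])
    return float(np.dot(np.array(weights), probs))

p = fuse(0.81, 0.67, 0.74, "borderline")          # fused probability of unresectability
label = "unresectable" if p >= 0.5 else "resectable"
```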

Funder

National Natural Science Foundation of China

Publisher

Wiley

Subject

Gastroenterology, Hepatology

Cited by 1 article.