Author:
Zhang Zhan, Zhang Qin, Jiao Yang, Lu Lin, Ma Lin, Liu Aihua, Liu Xiao, Zhao Juan, Xue Yajun, Wei Bing, Zhang Mingxia, Gao Ru, Zhao Hong, Lu Jie, Li Fan, Zhang Yang, Wang Yiming, Zhang Lei, Tian Fengwei, Hu Jie, Gou Xin
Abstract
AI-aided clinical diagnosis is desired in medical care. Existing deep learning models lack explainability and mainly focus on image analysis. The recently developed Dynamic Uncertain Causality Graph (DUCG) approach is causality-driven, explainable, and invariant across different application scenarios, avoiding the problems of data collection, labeling, fitting, privacy, bias, generalization, high cost, and high energy consumption. Through close collaboration between clinical experts and DUCG technicians, 46 DUCG models covering 54 chief complaints were constructed. Over 1,000 diseases can be diagnosed without triage. Before being applied in the real world, the 46 DUCG models were retrospectively verified by third-party hospitals. The verified diagnostic precisions were no less than 95%, and the diagnostic precision for every disease, including uncommon ones, was no less than 80%. After verification, the 46 DUCG models were applied in the real world in China. Over one million real diagnosis cases have been performed, with only 17 incorrect diagnoses identified. Owing to DUCG's transparency, the mistakes causing the incorrect diagnoses were found and corrected. The diagnostic abilities of the clinicians who applied DUCG frequently were improved significantly. Following an introduction to the previously presented DUCG methodology, the recommendation algorithm for potential medical checks is presented and the key idea of DUCG is summarized.
Funder
Institute for Guo Qiang, Tsinghua University
Beijing Yutong Intelligence Technology Co., Ltd., Beijing, China
National High Level Hospital Clinical Research Funding
Chongqing Science and Technology Bureau
Publisher
Springer Science and Business Media LLC