Authors: Wang Xingqiao, Xu Xiaowei, Tong Weida, Liu Qi, Liu Zhichao
Abstract
Causality plays an essential role in multiple scientific disciplines, including the social, behavioral, and biological sciences as well as portions of statistics and artificial intelligence. Manual causality assessment from a large number of free-text documents is time-consuming, labor-intensive, and sometimes impractical. Herein, we propose a general causal inference framework named DeepCausality to empirically estimate the causal factors for suspected endpoints embedded in free text. DeepCausality seamlessly incorporates AI-powered language models, named entity recognition, and Judea Pearl's do-calculus into a general causal inference framework that can serve different domain-specific applications. We exemplified the utility of DeepCausality by employing the LiverTox database to estimate causal terms related to idiosyncratic drug-induced liver injury (iDILI) and to generate a knowledge-based causal tree for iDILI patient stratification. DeepCausality yielded a prediction accuracy of 0.92 and an F-score of 0.84 for DILI prediction. Notably, 90% of the causal terms enriched by DeepCausality were consistent with the clinical causal terms defined by the American College of Gastroenterology (ACG) clinical guideline for evaluating suspected iDILI. Furthermore, we observed a high concordance of 0.91 between the iDILI severity scores generated by DeepCausality and those assigned by domain experts. Altogether, the proposed DeepCausality framework could be a promising solution for causality assessment from free text; it is publicly available at https://github.com/XingqiaoWang/https-github.com-XingqiaoWang-DeepCausality-LiverTox.
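To make the do-calculus component concrete: the core estimand behind intervention-based causal scoring is P(Y | do(X = x)), which, under a back-door adjustment, reduces to a stratified average over a confounder Z. The sketch below is a minimal, self-contained illustration of that adjustment on hypothetical toy data; it is not the paper's implementation, and the variable names and records are invented for illustration only.

```python
from collections import Counter

# Hypothetical toy data: each record is (z, x, y), where z is a confounder
# (e.g., a patient covariate), x marks presence of a candidate causal term,
# and y marks whether the suspected endpoint (e.g., liver injury) occurred.
records = [
    (0, 0, 0), (0, 0, 0), (0, 1, 1), (0, 1, 0),
    (1, 0, 1), (1, 0, 0), (1, 1, 1), (1, 1, 1),
]

def p_y_given_do_x(records, x_val):
    """Estimate P(Y=1 | do(X=x_val)) via the back-door adjustment:
    sum over z of P(Y=1 | X=x_val, Z=z) * P(Z=z)."""
    n = len(records)
    z_counts = Counter(z for z, _, _ in records)
    total = 0.0
    for z, nz in z_counts.items():
        stratum = [y for zz, xx, y in records if zz == z and xx == x_val]
        if not stratum:
            continue  # stratum unobserved; a real pipeline would smooth or skip
        p_y = sum(stratum) / len(stratum)  # P(Y=1 | X=x_val, Z=z)
        total += p_y * (nz / n)            # weighted by P(Z=z)
    return total

# Causal effect of the term on the endpoint, as a difference of interventions.
effect = p_y_given_do_x(records, 1) - p_y_given_do_x(records, 0)
```

In a DeepCausality-style pipeline, the binary X would come from term occurrence in the text and the conditional probabilities from a language-model classifier rather than raw counts, but the adjustment formula is the same.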
References (33 articles)
1. Beltagy et al. SciBERT: a pretrained language model for scientific text. arXiv [Preprint], 2019.
2. Brown et al. Language models are few-shot learners. Adv. Neural Inf. Process. Syst., 2020.
3. Chalasani et al. ACG clinical guideline: diagnosis and management of idiosyncratic drug-induced liver injury. ACG, 2021.
4. Chalkidis et al. LEGAL-BERT: the muppets straight out of law school. arXiv [Preprint], 2020.
5. Clark et al. ELECTRA: pre-training text encoders as discriminators rather than generators. arXiv [Preprint], 2020.
Cited by 6 articles.