An Explainable AI Platform to Help Healthcare Professionals of Diabetes Understand Predictions Made by Machine-learning Approaches (Preprint)

Author:

Rasha Hendawi, Juan Li, Souradip Roy

Abstract

BACKGROUND

Machine learning, especially deep learning, has been used to diagnose and predict diabetes. Machine learning-based approaches achieve good prediction accuracy and precision. However, they typically operate as black boxes whose underlying rationale is hidden from physicians and patients, which can cause confusion and distrust. This issue hinders the application of machine learning in diabetes care and other healthcare practice.

OBJECTIVE

This study aims to help healthcare professionals understand how AI makes predictions and recommendations for diabetes care. Specifically, we design, develop, and evaluate an explainable AI platform that not only predicts diabetes risk but also provides human-comprehensible explanations of complex, black-box machine learning models and their prediction results.

METHODS

An explainable AI framework, XAI4Diabetes, is designed with a multi-module explanation architecture based on machine learning, knowledge graphs, and ontologies. XAI4Diabetes consists of four modules: (1) a knowledge base module, (2) a knowledge matching module, (3) a prediction module, and (4) an interpretation module. XAI4Diabetes applies AI techniques to predict diabetes risk and to interpret both the prediction process and its results. A mobile application prototype was developed and evaluated through a usability study and satisfaction surveys.
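As a rough illustration of how the four modules could fit together, the sketch below wires a toy knowledge base, knowledge matcher, predictor, and interpreter into one pipeline. All class names, feature weights, and concept descriptions here are hypothetical stand-ins for demonstration, not the authors' actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeBase:
    """Module 1: diabetes domain knowledge (illustrative ontology concepts)."""
    concepts: dict = field(default_factory=lambda: {
        "glucose": "Plasma glucose concentration",
        "bmi": "Body mass index",
    })

class KnowledgeMatcher:
    """Module 2: maps raw dataset features to knowledge-base concepts."""
    def __init__(self, kb: KnowledgeBase):
        self.kb = kb

    def match(self, feature: str) -> str:
        return self.kb.concepts.get(feature, "unknown concept")

class Predictor:
    """Module 3: a stand-in risk model (weighted sum, not the real model)."""
    weights = {"glucose": 0.6, "bmi": 0.4}

    def predict(self, patient: dict) -> float:
        score = sum(self.weights[f] * patient[f] for f in self.weights)
        return min(score, 1.0)

class Interpreter:
    """Module 4: explains a prediction as per-feature contributions,
    each linked back to a human-readable concept from the knowledge base."""
    def __init__(self, matcher: KnowledgeMatcher, model: Predictor):
        self.matcher, self.model = matcher, model

    def explain(self, patient: dict) -> list:
        return sorted(
            ((f, self.model.weights[f] * patient[f], self.matcher.match(f))
             for f in self.model.weights),
            key=lambda t: -t[1],  # largest contribution first
        )

# Example: predict and explain for one (made-up) patient record.
patient = {"glucose": 0.9, "bmi": 0.5}
model = Predictor()
explainer = Interpreter(KnowledgeMatcher(KnowledgeBase()), model)
risk = model.predict(patient)
explanation = explainer.explain(patient)
```

The point of the sketch is the separation of concerns: the predictor stays a black box, while the interpreter translates its output into concept-annotated feature contributions a clinician can read.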

RESULTS

Results of the evaluation study demonstrate that medical professionals agreed that XAI4Diabetes helps them understand (1) how machine learning makes the diabetes prediction, (2) which datasets were used to train the machine learning model, (3) which features the dataset contains, and (4) the prediction results in terms of feature importance. Most participating medical professionals acknowledged that XAI4Diabetes helps them better understand and trust predictions made by AI systems. The satisfaction survey shows that participants were generally satisfied with the tool.
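Feature-importance explanations of the kind shown to participants can be produced model-agnostically. The sketch below computes a simple permutation importance for a black-box predictor: shuffle one feature's values and measure how much accuracy drops. The data, scoring function, and predictor here are made up for demonstration and are not taken from XAI4Diabetes itself.

```python
import random

def model_score(rows, labels, predict):
    """Accuracy of thresholded risk predictions against binary labels."""
    hits = sum((predict(r) >= 0.5) == bool(y) for r, y in zip(rows, labels))
    return hits / len(rows)

def permutation_importance(rows, labels, predict, feature, seed=0):
    """Importance = accuracy drop after shuffling one feature's values."""
    base = model_score(rows, labels, predict)
    values = [r[feature] for r in rows]
    random.Random(seed).shuffle(values)
    permuted = [dict(r, **{feature: v}) for r, v in zip(rows, values)]
    return base - model_score(permuted, labels, predict)

# Toy cohort: risk is driven entirely by glucose; "noise" is irrelevant.
rows = [{"glucose": g, "noise": n}
        for g, n in [(0.9, 0.1), (0.8, 0.7), (0.2, 0.9), (0.1, 0.3)]]
labels = [1, 1, 0, 0]
predict = lambda r: r["glucose"]

glucose_imp = permutation_importance(rows, labels, predict, "glucose")
noise_imp = permutation_importance(rows, labels, predict, "noise")
```

Because the method only queries the model through `predict`, it works for any black-box classifier, which is why importance rankings of this style are a common explanation format for clinical audiences.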

CONCLUSIONS

In this research, we designed, developed, and evaluated a multi-module explainable prediction model, XAI4Diabetes, for diabetes care. XAI4Diabetes provides an easy-to-use interface to predict a patient's risk of diabetes and to explain both the prediction process and its results. The experimental results show that the prototype mobile application helps healthcare professionals understand the AI decision-making process, thus improving transparency and trust. This could potentially mitigate various kinds of bias and promote the application of AI in diabetes care.

Publisher

JMIR Publications Inc.
