Factors Influencing Clinician Trust in Predictive Clinical Decision Support Systems for In-Hospital Deterioration: Qualitative Descriptive Study

Authors:

Jessica M Schwartz, Maureen George, Sarah Collins Rossetti, Patricia C Dykes, Simon R Minshall, Eugene Lucas, Kenrick D Cato

Abstract

Background: Clinician trust in machine learning–based clinical decision support systems (CDSSs) for predicting in-hospital deterioration (a type of predictive CDSS) is essential for adoption. Evidence shows that clinician trust in predictive CDSSs is influenced by perceived understandability and perceived accuracy.

Objective: The aim of this study was to explore the phenomenon of clinician trust in predictive CDSSs for in-hospital deterioration by confirming and characterizing factors known to influence trust (understandability and accuracy), uncovering and describing other influencing factors, and comparing nurses’ and prescribing providers’ trust in predictive CDSSs.

Methods: We followed a qualitative descriptive methodology, conducting directed deductive and inductive content analysis of interview data. Directed deductive analyses were guided by the human-computer trust conceptual framework. Semistructured interviews were conducted with nurses and prescribing providers (physicians, physician assistants, or nurse practitioners) working with a predictive CDSS at 2 hospitals within Mass General Brigham.

Results: A total of 17 clinicians were interviewed. Concepts from the human-computer trust conceptual framework—perceived understandability and perceived technical competence (ie, perceived accuracy)—were found to influence clinician trust in predictive CDSSs for in-hospital deterioration. The concordance between clinicians’ impressions of patients’ clinical status and system predictions influenced clinicians’ perceptions of system accuracy. Understandability was influenced by system explanations, both global and local, as well as training. In total, 3 additional themes emerged from the inductive analysis. The first, perceived actionability, captured the variation in clinicians’ desires for predictive CDSSs to recommend a discrete action. The second, evidence, described the importance of both macro- (scientific) and micro- (anecdotal) evidence for fostering trust. The final theme, equitability, described fairness in system predictions. The findings were largely similar between nurses and prescribing providers.

Conclusions: Although there is a perceived trade-off between machine learning–based CDSS accuracy and understandability, our findings confirm that both are important for fostering clinician trust in predictive CDSSs for in-hospital deterioration. We found that reliance on the predictive CDSS in the clinical workflow may influence clinicians’ requirements for trust. Future research should explore the impact of reliance, the optimal explanation design for enhancing understandability, and the role of perceived actionability in driving trust.

Publisher

JMIR Publications Inc.

Subject

Health Informatics,Human Factors and Ergonomics
