Evaluation of the EsteR Toolkit for COVID-19 Decision Support: Sensitivity Analysis and Usability Study

Authors:

Rieke Alpers, Lisa Kühne, Hong-Phuc Truong, Hajo Zeeb, Max Westphal, Sonja Jäckle

Abstract

Background: During the COVID-19 pandemic, local health authorities were responsible for managing and reporting current cases in Germany. Since March 2020, employees had to contain the spread of COVID-19 by monitoring and contacting infected persons as well as tracing their contacts. In the EsteR project, we implemented existing and newly developed statistical models as decision support tools to assist in the work of the local health authorities.

Objective: The main goal of this study was to validate the EsteR toolkit in two complementary ways: first, investigating the stability of the answers provided by our statistical tools with respect to the model parameters in the back end and, second, evaluating the usability and applicability of our web application in the front end by test users.

Methods: For model stability assessment, a sensitivity analysis was carried out for all 5 developed statistical models. The default parameters of our models as well as the test ranges of the model parameters were based on a previous literature review on COVID-19 properties. The answers obtained for different parameters were compared using dissimilarity metrics and visualized using contour plots. In addition, the parameter ranges of general model stability were identified. For the usability evaluation of the web application, cognitive walk-throughs and focus group interviews were conducted with 6 containment scouts located at 2 different local health authorities. They were first asked to complete small tasks with the tools and then express their general impressions of the web application.

Results: The simulation results showed that some statistical models were more sensitive to changes in their parameters than others. For each of the single-person use cases, we determined an area where the respective model could be rated as stable. In contrast, the results of the group use cases highly depended on the user inputs, and thus, no area of parameters with general model stability could be identified. We have also provided a detailed simulation report of the sensitivity analysis. In the user evaluation, the cognitive walk-throughs and focus group interviews revealed that the user interface needed to be simplified and that more information was necessary as guidance. In general, the testers rated the web application as helpful, especially for new employees.

Conclusions: This evaluation study allowed us to refine the EsteR toolkit. Using the sensitivity analysis, we identified suitable model parameters and analyzed how stable the statistical models were in terms of changes in their parameters. Furthermore, the front end of the web application was improved with the results of the conducted cognitive walk-throughs and focus group interviews regarding its user-friendliness.
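The sensitivity-analysis procedure described in the Methods section can be sketched in a few lines. The sketch below is hypothetical and not the authors' implementation: it uses a toy stand-in model (a discretized normal incubation-time distribution with assumed default parameters), varies two parameters on a grid around their defaults, and scores each answer against the default answer with total variation distance as the dissimilarity metric, yielding exactly the kind of grid one would pass to a contour plot to read off a stability region.

```python
import numpy as np

def model_answer(mean, sd, days=np.arange(0, 21)):
    """Toy stand-in for one statistical model: probability of symptom
    onset per day, from a discretized normal incubation-time curve."""
    pdf = np.exp(-0.5 * ((days - mean) / sd) ** 2)
    return pdf / pdf.sum()

def total_variation(p, q):
    """Dissimilarity metric between two discrete distributions (in [0, 1])."""
    return 0.5 * np.abs(p - q).sum()

# Assumed defaults and test ranges (illustrative, not from the paper).
default = model_answer(mean=5.5, sd=2.0)
means = np.linspace(4.0, 7.0, 13)
sds = np.linspace(1.0, 3.0, 9)

# Dissimilarity of each parameter combination to the default answer;
# this grid is what a contour plot would visualize.
grid = np.array([[total_variation(model_answer(m, s), default)
                  for m in means] for s in sds])

# A stability region can then be read off as all parameter combinations
# whose dissimilarity stays below a chosen threshold, e.g. 0.1.
stable = grid < 0.1
```

With `matplotlib`, `plt.contourf(means, sds, grid)` would render the contour plot; the cell at the default parameters has dissimilarity 0 by construction, and the stability region grows or shrinks with the chosen threshold.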

Publisher

JMIR Publications Inc.

Subject

Health Informatics, Medicine (miscellaneous)

Cited by 2 articles.
