Smartphone apps for point-of-care information summaries: systematic assessment of the quality and content

Authors:

Lee Mauricette, Lin Xiaowen, Chai Joanne Zhi Qi, Lee Eng Sing, Smith Helen, Tudor Car Lorainne

Abstract

Background
Clinicians need easy access to evidence-based information to inform their clinical practice. Point-of-care information summaries are increasingly available in the form of smartphone apps. However, the quality of the information in these apps is questionable, as there is currently no regulation of the content of medical apps.

Objectives
This study aimed to systematically assess the quality and content of medical apps providing point-of-care information summaries available in two major app stores. We evaluated apps designed specifically for healthcare professionals and assessed their content development, editorial policy, coverage of medical conditions and trustworthiness.

Methods
We conducted a systematic assessment of medical apps providing point-of-care information summaries available in the Google Play and Apple app stores. Apps launched or updated since January 2020 were identified through a systematic search using 42matters. Apps meeting the inclusion criteria were downloaded and assessed. Data extraction and app assessment were performed in parallel and independently by at least two reviewers. Apps were evaluated against adapted criteria: (1) general characteristics, (2) content presentation of the summaries, (3) editorial quality, (4) evidence-based methodology, (5) coverage (volume) of medical conditions, (6) usability of the apps and (7) trustworthiness of the app based on the HONcode principles, which are guidelines used to inform users about the credibility and reliability of health information online. The results were reported as a narrative review.

Results
Eight medical apps met the inclusion criteria and were systematically appraised. Based on our evaluation criteria, UpToDate supported 16 languages, whereas all the other apps were available in English only. Bullet points and brief paragraphs were used in all apps, and only DynaMed and Micromedex and Pathway-medical knowledge provided a formal grading system for the strength of recommendations for all the medical conditions in their apps; the other apps either lacked a formal grading system altogether or offered one for only some medical conditions. About 30% of the editorial quality items and 47.5% of the evidence-based methodology items were unclear or missing. UpToDate contained the most point-of-care evidence-based documents, with >10 500 documents. All apps except 5-Minute Clinical Consult and DynaMed and Micromedex were available for offline access. Only Medscape complied with the HONcode principles.

Conclusions
Future apps should report a more detailed evidence-based methodology, be accessible for offline use and support searching in more than one language. Future apps should also provide clearer declarations of authorship and conflicts of interest.

Publisher

BMJ

Subject

General Medicine

Cited by 2 articles.

