Assessing the Reproducibility of Research Based on the Food and Drug Administration Manufacturer and User Facility Device Experience Data

Author:

Li Xinyu1, Feng Yubo1, Gong Yang2, Chen You

Affiliation:

1. Department of Computer Science, Vanderbilt University, Nashville, Tennessee

2. School of Biomedical Informatics, The University of Texas Health Science Center at Houston, Houston, Texas

Abstract

Objective: This article assesses the reproducibility of studies driven by Manufacturer and User Facility Device Experience (MAUDE) data by analyzing the data queries used in their research processes.

Methods: Studies using MAUDE data were sourced from PubMed by searching for "MAUDE" or "Manufacturer and User Facility Device Experience" in titles or abstracts. We manually selected articles containing executable queries. The reproducibility of each query was assessed by replicating it through the MAUDE Application Programming Interface (API). A query was considered reproducible if its reproducibility coefficient, the ratio of the number of medical device reports (MDRs) returned by the reproduced query to the number of MDRs reported in the original study, fell within 0.95 to 1.05. We also computed the reproducibility ratio, the fraction of reproducible queries within subgroups defined by query complexity, device category, and the presence of a data processing flow.

Results: As of August 8, 2022, we identified 523 articles, of which 336 contained queries and 60 of those queries were executable. Among these, 14 queries were reproducible. Queries using a single field such as product code, product class, or brand name showed higher reproducibility (50%, 33.3%, and 31.3%, respectively) than those using other fields (8.3%, P = 0.037). Single-category device queries exhibited a higher reproducibility ratio than multicategory ones, though the difference was not statistically significant (27.1% versus 8.3%, P = 0.321). Studies that included a data processing flow had a higher reproducibility ratio than those without, although this difference was also not statistically significant (42.9% versus 17.4%, P = 0.107).

Conclusions: Our findings indicate that the reproducibility of queries in MAUDE data-driven studies is limited. Improving it will require more effective MAUDE data query strategies and improved application programming interfaces.
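The reproducibility criterion described in the abstract can be sketched as a simple check. This is a minimal illustration, not code from the study itself: the function names are my own, and the 0.95-1.05 bounds are taken directly from the abstract.

```python
def reproducibility_coefficient(reproduced_mdrs: int, original_mdrs: int) -> float:
    """Ratio of the MDR count returned by the reproduced query
    to the MDR count reported in the original study."""
    if original_mdrs <= 0:
        raise ValueError("original MDR count must be positive")
    return reproduced_mdrs / original_mdrs


def is_reproducible(reproduced_mdrs: int, original_mdrs: int,
                    low: float = 0.95, high: float = 1.05) -> bool:
    """A query is deemed reproducible if its coefficient falls within [low, high]."""
    return low <= reproducibility_coefficient(reproduced_mdrs, original_mdrs) <= high
```

For example, a reproduced query returning 96 MDRs against an originally reported 100 yields a coefficient of 0.96 and counts as reproducible, while 110 against 100 does not.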

Publisher

Ovid Technologies (Wolters Kluwer Health)

