Best practices to evaluate the impact of biomedical research software—metric collection beyond citations

Authors:

Awan Afiaz (1,2), Andrey A. Ivanov (3), John Chamberlin (4), David Hanauer (5), Candace L. Savonen (2), Mary J. Goldman (6), Martin Morgan (7), Michael Reich (8), Alexander Getka (9), Aaron Holmes (10,11,12,13), Sarthak Pati (9,14,15), Dan Knight (10,11,12,13), Paul C. Boutros (10,11,12,13), Spyridon Bakas (9,14,15), J. Gregory Caporaso (16), Guilherme Del Fiol (4), Harry Hochheiser (17), Brian Haas (18), Patrick D. Schloss (19), James A. Eddy (20), Jake Albrecht (20), Andrey Fedorov (21), Levi Waldron (22), Ava M. Hoffman (2), Richard L. Bradshaw (4), Jeffrey T. Leek (2), Carrie Wright (2)

Affiliation:

1. Department of Biostatistics, University of Washington, Seattle, WA, 98195, United States

2. Biostatistics Program, Public Health Sciences Division, Fred Hutchinson Cancer Center, Seattle, WA, 98109, United States

3. Department of Pharmacology and Chemical Biology, Emory University School of Medicine, Emory University, Atlanta, GA, 30322, United States

4. Department of Biomedical Informatics, University of Utah, Salt Lake City, UT, 84108, United States

5. Department of Learning Health Sciences, University of Michigan Medical School, Ann Arbor, MI, 48109, United States

6. UC Santa Cruz Genomics Institute, University of California Santa Cruz, Santa Cruz, CA, 95060, United States

7. Roswell Park Comprehensive Cancer Center, Buffalo, NY, 14263, United States

8. University of California San Diego, La Jolla, CA, 92093, United States

9. University of Pennsylvania, Philadelphia, PA, 19104, United States

10. Jonsson Comprehensive Cancer Center, University of California, Los Angeles, CA, 90095, United States

11. Institute for Precision Health, University of California, Los Angeles, CA, 90095, United States

12. Department of Human Genetics, University of California, Los Angeles, CA, 90095, United States

13. Department of Urology, University of California, Los Angeles, CA, 90095, United States

14. Division of Computational Pathology, Department of Pathology and Laboratory Medicine, Indiana University School of Medicine, Indianapolis, IN, 46202, United States

15. Center for Federated Learning, Indiana University School of Medicine, Indianapolis, IN, 46202, United States

16. Pathogen and Microbiome Institute, Northern Arizona University, Flagstaff, AZ, 86011, United States

17. Department of Biomedical Informatics, University of Pittsburgh, Pittsburgh, PA, 15206, United States

18. Methods Development Laboratory, Broad Institute, Cambridge, MA, 02141, United States

19. Department of Microbiology and Immunology, University of Michigan, Ann Arbor, MI, 48109, United States

20. Sage Bionetworks, Seattle, WA, 98121, United States

21. Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, 02138, United States

22. Department of Epidemiology and Biostatistics, City University of New York Graduate School of Public Health and Health Policy, New York, NY, 10027, United States

Abstract

Motivation: Software is vital to the advancement of biology and medicine. Impact evaluations of scientific software have primarily emphasized traditional citation metrics of associated papers, even though such metrics inadequately capture the dynamic picture of impact and are complicated by improper citation practices.

Results: To understand how software developers evaluate their tools, we surveyed participants in the Informatics Technology for Cancer Research (ITCR) program funded by the National Cancer Institute (NCI). We found that although developers recognize the value of more extensive metric collection, a lack of funding and time hinders them. We also investigated how often software from this community implemented infrastructure that supports more nontraditional metrics, and how this affected the rate of papers describing usage of the software. Infrastructure such as a social media presence, more in-depth documentation, software health metrics, and clear information on how to contact developers appeared to be associated with increased mention rates. Analysing more diverse metrics can enable developers to better understand user engagement, justify continued funding, identify novel use cases, pinpoint areas for improvement, and ultimately amplify their software’s impact. These approaches carry challenges, including distorted or misleading metrics as well as ethical and security concerns. More attention is needed to the nuances involved in capturing impact across the spectrum of biomedical software. For funders and developers, we outline guidance based on experience from our community. By reconsidering how we evaluate software, we can empower developers to create tools that more effectively accelerate biological and medical research.

Availability and implementation: More information about the analysis, as well as access to the data and code, is available at https://github.com/fhdsl/ITCR_Metrics_manuscript_website.
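As a hedged illustration of the kind of metric collection described above (not drawn from the paper or its dataset), nontraditional engagement indicators such as download counts and issue responsiveness can be aggregated programmatically. The sketch below uses entirely hypothetical tool names and figures; the `ToolMetrics` fields and the composite summary are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class ToolMetrics:
    """Hypothetical nontraditional usage metrics for one software tool."""
    name: str
    monthly_downloads: int
    github_stars: int
    open_issues: int
    closed_issues: int

    def issue_close_rate(self) -> float:
        """Fraction of issues resolved -- a rough proxy for software health."""
        total = self.open_issues + self.closed_issues
        return self.closed_issues / total if total else 0.0


def summarize(tools: list[ToolMetrics]) -> dict[str, float]:
    """Aggregate engagement indicators across a portfolio of tools."""
    return {
        "total_downloads": sum(t.monthly_downloads for t in tools),
        "mean_close_rate": sum(t.issue_close_rate() for t in tools) / len(tools),
    }


# Hypothetical example data, not from the ITCR survey.
tools = [
    ToolMetrics("toolA", 1200, 85, 4, 36),
    ToolMetrics("toolB", 300, 12, 10, 10),
]
print(summarize(tools))
```

In practice such fields would be populated from sources like package-registry download statistics or a repository-hosting API, and the chosen indicators would depend on the tool and its community.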

Funder

National Cancer Institute

National Institutes of Health

Publisher

Oxford University Press (OUP)
