Lessons Learned for Identifying and Annotating Permissions in Clinical Consent Forms

Authors:

Elizabeth E. Umberfield (1,2), Yun Jiang (3), Susan H. Fenton (4), Cooper Stansbury (5,6), Kathleen Ford (3), Kaycee Crist (7), Sharon L. R. Kardia (8), Andrea K. Thomer (9), Marcelline R. Harris (3)

Affiliations:

1. Health Policy & Management, Indiana University Richard M. Fairbanks School of Public Health, Indianapolis, Indiana, United States

2. Center for Biomedical Informatics, Regenstrief Institute, Inc., Indianapolis, Indiana, United States

3. Department of Systems, Populations and Leadership, University of Michigan School of Nursing, Ann Arbor, Michigan, United States

4. School of Biomedical Informatics, University of Texas Health Science Center, Houston, Texas, United States

5. Department of Computational Medicine and Bioinformatics, University of Michigan Medical School, Ann Arbor, Michigan, United States

6. The Michigan Institute for Computational Discovery and Engineering, University of Michigan, Ann Arbor, Michigan, United States

7. Rory Meyers School of Nursing, New York University, New York, New York, United States

8. Department of Epidemiology, University of Michigan School of Public Health, Ann Arbor, Michigan, United States

9. University of Michigan School of Information, Ann Arbor, Michigan, United States

Abstract

Background: The lack of machine-interpretable representations of consent permissions precludes the development of tools that act upon permissions across information ecosystems at scale.

Objectives: To report the process, results, and lessons learned while annotating permissions in clinical consent forms.

Methods: We conducted a retrospective analysis of clinical consent forms. We developed an annotation scheme following the MAMA (Model-Annotate-Model-Annotate) cycle and evaluated interannotator agreement (IAA) using observed agreement (Ao), weighted kappa (κw), and Krippendorff's α.

Results: The final dataset included 6,399 sentences from 134 clinical consent forms. Complete agreement was achieved for 5,871 sentences, comprising 211 positively identified and 5,660 negatively identified permission-sentences across all three annotators (Ao = 0.944, Krippendorff's α = 0.599). These values reflect moderate to substantial IAA. Although permission-sentences share a common vocabulary and structure, disagreements between annotators are largely explained by lexical variability and ambiguity in sentence meaning.

Conclusion: Our findings point to the complexity of identifying permission-sentences within clinical consent forms. We present our results in light of lessons learned, which may serve as a launching point for developing tools for automated permission extraction.
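The agreement statistics reported in the Results can be computed directly from raw annotator labels. The sketch below is a minimal illustration, not the authors' code: it takes observed agreement to be the fraction of sentences on which all annotators assign the same label (the paper's Ao may instead be a pairwise average), and computes Krippendorff's α for nominal labels via the standard coincidence-matrix formulation. The `ratings` toy data are invented for demonstration only.

```python
from collections import Counter

def observed_agreement(labels_per_sentence):
    """Fraction of sentences on which every annotator assigned the same label."""
    return sum(len(set(labels)) == 1 for labels in labels_per_sentence) / len(labels_per_sentence)

def krippendorff_alpha_nominal(labels_per_sentence):
    """Krippendorff's alpha for nominal labels with no missing ratings,
    using the standard coincidence-matrix formulation."""
    coincidences = Counter()
    for labels in labels_per_sentence:
        m = len(labels)
        for i, a in enumerate(labels):
            for j, b in enumerate(labels):
                if i != j:
                    coincidences[(a, b)] += 1 / (m - 1)
    # Marginal totals n_c and grand total n from the coincidence matrix.
    n_c = Counter()
    for (a, _b), v in coincidences.items():
        n_c[a] += v
    n = sum(n_c.values())
    # Observed vs. expected disagreement (nominal metric: 0 if labels match, 1 otherwise).
    d_o = sum(v for (a, b), v in coincidences.items() if a != b)
    d_e = sum(n_c[a] * n_c[b] for a in n_c for b in n_c if a != b) / (n - 1)
    return 1.0 - d_o / d_e

# Toy data: three annotators labelling five sentences as permission (1) or not (0).
ratings = [(1, 1, 1), (0, 0, 0), (1, 0, 1), (0, 0, 0), (1, 1, 0)]
print(f"Observed agreement: {observed_agreement(ratings):.3f}")      # 0.600
print(f"Krippendorff's alpha: {krippendorff_alpha_nominal(ratings):.3f}")  # 0.500
```

For binary permission / non-permission labels the nominal distance suffices; the weighted κw reported in the paper would additionally require a rater-pairwise computation with an explicit weighting scheme for partial disagreement.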

Funder

Robert Wood Johnson Foundation Future of Nursing Scholars Program

National Library of Medicine

National Human Genome Research Institute

Rackham Graduate Student Research Grant

University of Michigan Institute for Data Science

Publisher

Georg Thieme Verlag KG

Subject

Health Information Management, Computer Science Applications, Health Informatics

