Abstract
Background
The quality of patient medical records is intrinsically related to patient safety, clinical decision-making, communication between health providers, and continuity of care. Record data are also widely used in observational studies. However, the reliability of information extracted from records is a matter of concern, and audit processes must ensure inter-rater agreement (IRA). The objective of this study was therefore to evaluate the IRA among members of the Patient Health Record Review Board (PHRRB) in routine auditing of medical records, and the impact of periodic discussion of results with the raters.
Methods
A prospective longitudinal study was conducted between July 2015 and April 2016 at Hospital Municipal Dr. Moysés Deutsch, a large public hospital in São Paulo. The PHRRB was composed of 12 physicians, 9 nurses, and 3 physiotherapists who audited medical records monthly, with the number of raters changing throughout the study. PHRRB meetings were held to reach a consensus on the rating criteria that members use in the auditing process. A review chart was created for raters to verify the registration of the patient's secondary diagnosis, chief complaint, history of presenting complaint, past medical history, medication history, physical exam, and diagnostic testing. The IRA was obtained every three months. Gwet's AC1 coefficient and the Proportion of Agreement (PA) were calculated to evaluate the IRA for each item over time.
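The abstract reports two agreement statistics, the Proportion of Agreement (PA) and Gwet's AC1, without showing how they are computed. As an illustrative sketch only (not the authors' code), the standard two-rater forms of both statistics can be computed as below; the function name `gwet_ac1` and the example ratings are assumptions for illustration.

```python
from collections import Counter


def gwet_ac1(ratings_a, ratings_b):
    """Two-rater Gwet's AC1 for categorical ratings.

    PA is the observed proportion of items on which the raters agree;
    AC1 corrects PA for chance agreement using Gwet's chance model:
        pe = (1 / (q - 1)) * sum_k pi_k * (1 - pi_k),
    where pi_k is the marginal proportion of category k across both raters
    and q is the number of categories used.
    """
    if len(ratings_a) != len(ratings_b) or not ratings_a:
        raise ValueError("ratings must be two non-empty sequences of equal length")
    n = len(ratings_a)
    # Proportion of Agreement (PA): fraction of items rated identically
    pa = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Marginal category proportions pooled over both raters
    counts = Counter(ratings_a) + Counter(ratings_b)
    q = len(counts)
    if q < 2:
        return 1.0  # only one category in use: agreement is trivially complete
    pi = [c / (2 * n) for c in counts.values()]
    # Gwet's chance-agreement probability
    pe = sum(p * (1 - p) for p in pi) / (q - 1)
    return (pa - pe) / (1 - pe)


# Toy example with binary item ratings (1 = registered, 0 = not registered)
ac1 = gwet_ac1([1, 1, 0, 1], [1, 0, 0, 1])  # PA here is 3/4
```

Unlike Cohen's kappa, AC1 remains stable when category prevalence is highly skewed, which is one reason it is often paired with PA in record-audit studies such as this one.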
Results
The study included 1884 items from 239 records, with an overall full agreement among raters of 71.2%. A significant IRA increase of 16.5% (OR = 1.17; 95% CI = 1.03–1.32; p = 0.014) was found in the routine PHRRB auditing, with no significant differences between the PA and Gwet's AC1, which showed a similar evolution over time. The PA decreased by 27.1% when at least one of the raters was absent from the review meeting (OR = 0.73; 95% CI = 0.53–1.00; p = 0.048).
Conclusions
Medical record quality has been associated with the quality of care and can be optimized and improved by targeted interventions. The PA and Gwet's AC1 are suitable agreement coefficients that can feasibly be incorporated into the routine PHRRB evaluation process.
Funder
Fundação de Amparo à Pesquisa do Estado de São Paulo
Publisher
Springer Science and Business Media LLC
Cited by: 1 article.