Authors
Lee Seong Hoon, Aw Kah Long, McVerry Ferghal, McCarron Mark O.
Abstract
Objective
To determine the interrater variability in TIA diagnosis among expert clinicians (neurologists/stroke physicians), administrative data, and nonspecialists.

Methods
We performed a meta-analysis of studies published from January 1984 to January 2019, identified through MEDLINE, EMBASE, and PubMed. Two reviewers independently screened for eligible studies and extracted interrater variability measurements, using Cohen's kappa scores to assess diagnostic agreement.

Results
Nineteen original studies comprising 19,421 patients were included. Expert clinicians demonstrated good agreement for TIA diagnosis (κ = 0.71, 95% confidence interval [CI] = 0.62–0.81). Agreement between clinicians' TIA diagnoses and administrative data was also good (κ = 0.68, 95% CI = 0.62–0.74). There was moderate agreement (κ = 0.41, 95% CI = 0.22–0.61) between referring clinicians and the clinicians at TIA clinics who received the referrals. Sixty percent of 748 patient referrals to TIA clinics were TIA mimics.

Conclusions
Overall agreement between expert clinicians was good for TIA diagnosis, although diagnoses still varied for a sizeable proportion of cases. Diagnostic agreement for TIA decreased among nonspecialists. A substantial number of patients were referred to TIA clinics with other (often neurologic) diagnoses, suggesting that TIA clinics should be run by clinicians who are proficient in managing TIAs and their mimics.
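As background (a standard definition, not stated in the abstract itself), Cohen's kappa corrects the observed agreement between two raters for the agreement expected by chance:

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

where \(p_o\) is the observed proportion of agreement and \(p_e\) is the proportion expected by chance. As an illustration with hypothetical values, two clinicians agreeing on 90% of cases when chance alone would yield 65% gives \(\kappa = (0.90 - 0.65)/(1 - 0.65) \approx 0.71\), in the range conventionally described as good agreement, as reported above for expert clinicians.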
Publisher
Ovid Technologies (Wolters Kluwer Health)
Cited by
3 articles.