Classifications for Radiographic Evaluation of Radiolucent Bone Lesions have Poor Inter- and Intra-observer Agreement

Authors:

Willenbring Taylor J.1, Papa Sarah M.1, Mann Kenneth A.1, Cavallaro Salvatore1, Damron Timothy A.1

Affiliation:

1. SUNY Upstate Medical University

Abstract

Background
Radiolucent bone lesions are encountered in all orthopedic specialties, and concise description is essential to inform evaluation and treatment. We studied the interobserver reliability and intra-observer reproducibility of three classification systems for radiographic radiolucent lesions: (1) the original Lodwick classification, (2) the modified Lodwick classification, and (3) the Enneking classification for benign tumors. We hypothesized that intra-observer reproducibility would be good but interobserver reliability would be poor, that agreement would improve with training level, and that it would be highest for the Enneking classification.

Methods
Forty-eight case sets of de-identified radiographs of radiolucent osseous lesions were selected from an orthopedic oncology practice. Each set included two orthogonal views of the lesion from initial presentation. Twenty participants (one third-year medical student, 18 residents, and one orthopedic oncologist) classified each case twice, with a minimum two-week gap between sessions, according to the Lodwick, modified Lodwick, and Enneking classifications. Interobserver reliability and intra-observer reproducibility were calculated using Fleiss' kappa and Krippendorff's alpha, treating the classifications as nominal and ordinal rankings, respectively. Linear regression models were used to determine the effect of training level on reproducibility. Contingency tables were used to assess the accuracy of identifying benign versus malignant lesions against the known diagnoses.

Results
Interobserver reliability was poor, with agreement of 39% (κ = 0.23; α = 0.54), 39% (κ = 0.25; α = 0.48), and 53% (κ = 0.28; α = 0.45) for the Lodwick, modified Lodwick, and Enneking classifications, respectively. Intra-observer reproducibility also lacked strong agreement (κ = 0.42–0.45). Training level had no effect on reproducibility (R² < 0.2, p > 0.05 for all classifications). Intra-observer reproducibility by Krippendorff's alpha was 0.72 for the Lodwick, 0.69 for the modified Lodwick, and 0.63 for the Enneking classification. Self-agreement for individuals ranged from 39–78%. Lesions were correctly classified as malignant in 73.3%, 59.0%, and 62% of cases for the three classification systems, respectively.

Conclusions
Our data demonstrate that three common classifications for osseous radiolucent lesions are neither reliable nor reproducible. Consistency of classification varied depending on lesion characteristics, with the strongest reproducibility demonstrated for the highest and lowest grades of the classification systems. There was no association between orthopedic experience and intra-observer reproducibility. These deficiencies may be improved with AI applications.
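For readers unfamiliar with the agreement statistic used above, the following is a minimal pure-Python sketch of Fleiss' kappa for multiple raters assigning nominal categories. This is an illustration of the statistic only, not the authors' analysis code, and the rating counts shown are hypothetical.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for nominal agreement among a fixed number of raters.

    counts[i][j] = number of raters who assigned subject i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(counts)        # number of subjects (e.g., radiographic cases)
    n = sum(counts[0])     # raters per subject
    k = len(counts[0])     # number of categories (e.g., classification grades)

    # Per-subject observed agreement P_i
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P) / N     # mean observed agreement across subjects

    # Chance agreement P_e from the marginal category proportions
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)

    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 3 raters grading 3 lesions into 2 categories.
# Perfect agreement on every case yields kappa = 1.0.
print(fleiss_kappa([[3, 0], [0, 3], [3, 0]]))  # -> 1.0
```

A kappa near 0 indicates agreement no better than chance; values such as the 0.23–0.28 reported above fall in the conventionally "fair" range, consistent with the study's conclusion of poor interobserver reliability.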

Publisher

Springer Science and Business Media LLC
