Evaluation of editors’ abilities to predict the citation potential of research manuscripts submitted to The BMJ: a cohort study

Authors:

Sara Schroter, Wim E J Weber, Elizabeth Loder, Jack Wilkinson, Jamie J Kirkham

Abstract

Objective To evaluate the ability of The BMJ editors to predict the number of times submitted research manuscripts will be cited.

Design Cohort study.

Setting Manuscripts submitted to The BMJ, peer reviewed, and subsequently scheduled for discussion at a prepublication meeting between 27 August 2015 and 29 December 2016.

Participants 10 BMJ research team editors.

Main outcome measures Reviewed manuscripts were rated independently by attending editors for citation potential in the year of first publication plus the next year: no citations, below average (<10 citations), average (10-17 citations), or high (>17 citations). Predicted citations were then compared with actual citations extracted from Web of Science (WOS).

Results Of 534 manuscripts reviewed, 505 were published as full length articles (219 in The BMJ) by the end of 2019 and indexed in WOS, 22 were unpublished, and one abstract was withdrawn. Among the 505 published manuscripts, the median (IQR [range]) number of citations in the year of publication plus the following year was 9 (4-17 [0-150]); 277 (55%) manuscripts were cited <10 times, 105 (21%) were cited 10-17 times, and 123 (24%) were cited >17 times. Manuscripts accepted by The BMJ were cited more often (median 12 (IQR 7-24) citations) than those rejected (median 7 (IQR 3-12) citations). For all 10 editors, predicted ratings tended to increase in line with actual citations, but with considerable variation within categories; nine of the 10 editors predicted the correct citation category for fewer than half of the manuscripts (range of correct predictions 31%-52%), and κ for agreement between predicted and actual categories ranged from 0.01 to 0.19. Editors more often rated papers that went on to achieve high citation counts as having low citation potential than the reverse. Collectively, the mean percentage of editors predicting the correct citation category was 43%, and for 160 (32%) manuscripts at least half of the editors predicted the correct category.
Conclusions Editors were not good at estimating the citation potential of manuscripts, either individually or as a group; there was no evidence of a “wisdom of the crowd” effect among The BMJ’s editors.
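The agreement statistic quoted in the results (κ between 0.01 and 0.19) measures how much editors’ predicted categories agreed with the actual citation categories beyond chance. As a minimal illustration only, the sketch below computes unweighted Cohen’s κ for two categorical ratings of the same items; the abstract does not state whether the study used weighted or unweighted κ, and the function and category labels here are illustrative, not the study’s analysis code.

```python
from collections import Counter

def cohens_kappa(predicted, actual):
    """Unweighted Cohen's kappa: chance-corrected agreement between
    two categorical ratings of the same items."""
    assert len(predicted) == len(actual) and len(predicted) > 0
    n = len(predicted)
    # Observed agreement: fraction of items where the ratings match.
    p_o = sum(p == a for p, a in zip(predicted, actual)) / n
    # Expected agreement if the two ratings were independent,
    # based on each rating's marginal category frequencies.
    pred_counts = Counter(predicted)
    act_counts = Counter(actual)
    categories = set(pred_counts) | set(act_counts)
    p_e = sum(pred_counts[c] * act_counts[c] for c in categories) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative categories like those in the study: predicted vs actual.
predicted = ["below", "below", "average", "high"]
actual = ["below", "average", "average", "high"]
print(round(cohens_kappa(predicted, actual), 3))  # → 0.636
```

κ = 0 indicates agreement no better than chance and κ = 1 perfect agreement, so the 0.01-0.19 range reported above reflects very weak editor performance.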

Publisher

BMJ

Subject

General Engineering

