Abstract
Objectives: Citations do not always equate to endorsement, so it is important to understand the context of a citation. Researchers may rely heavily on a paper they cite, refute it entirely, or mention it only in passing, so accurate classification of a citation's meaning is valuable for researchers and other users. While AI tools have emerged to provide this more nuanced meaning, their accuracy has yet to be determined. This project seeks to assess the accuracy of scite in classifying the meaning of citations in a sample of publications.
Methods: Using a previously established sample of systematic reviews that cited retracted publications, we conducted known-item searching in scite, a tool that uses machine learning to categorize the meaning of citations. scite's interpretation of each citation's meaning was recorded, as was our own assessment. Citations were classified as mentioning, supporting, or contrasting. Recall, precision, and F-measure were calculated to describe the accuracy of scite's assessments in comparison with human assessment.
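The abstract does not specify how the per-class metrics were computed; the following is a minimal sketch of one common approach, treating the human labels as ground truth and scoring scite's labels per class. The function name and the example label lists are illustrative placeholders, not the study's data or code.

```python
# Illustrative sketch: per-class precision, recall, and F-measure for
# human vs. scite citation labels. Assumes human labels are ground truth.

def per_class_metrics(human, scite, labels=("supporting", "mentioning", "contrasting")):
    """Return precision, recall, and F-measure for each label."""
    metrics = {}
    for label in labels:
        tp = sum(1 for h, s in zip(human, scite) if h == label and s == label)
        fp = sum(1 for h, s in zip(human, scite) if h != label and s == label)
        fn = sum(1 for h, s in zip(human, scite) if h == label and s != label)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f_measure = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
        metrics[label] = {"precision": precision, "recall": recall, "f_measure": f_measure}
    return metrics

# Example with made-up labels (not the study's sample):
human = ["supporting", "mentioning", "contrasting", "mentioning"]
scite = ["mentioning", "mentioning", "mentioning", "supporting"]
print(per_class_metrics(human, scite))
```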
Results: From the original sample of 324 citations, 98 were classified in scite. Of these, scite classified 2 as supporting and 96 as mentioning, while we determined that 42 were supporting, 39 were mentioning, and 17 were contrasting. Supporting citations had high precision but low recall, while mentioning citations had high recall but low precision. F-measures ranged from 0.0 to 0.58, indicating low classification accuracy.
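For illustration (this arithmetic is not reported in the abstract): if both of scite's 2 supporting labels agreed with the human assessment, supporting precision would be 2/2 = 1.00 while recall would be only 2/42 ≈ 0.05, consistent with the high-precision, low-recall pattern described above.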
Conclusions: In our sample, the overall accuracy of scite's assessments was low. scite was less able to classify supporting and contrasting citations, instead labeling them as mentioning. Although there is potential and enthusiasm for AI tools to make engagement with the literature easier and more immediate, the results generated by AI differed substantially from human interpretation.
Subject
General Earth and Planetary Sciences, General Environmental Science
Cited by
3 articles.