Abstract
Purpose
This paper aims to quantify the quality of peer reviews, evaluate them from different perspectives and develop a model to predict review quality. In addition, it investigates which features are effective for distinguishing review quality.
Design/methodology/approach
First, a fine-grained data set including peer review data, citations and review conformity scores was constructed. Second, metrics were proposed to evaluate the quality of peer reviews from three aspects. Third, five categories of features were extracted from reviews, submissions and responses using natural language processing (NLP) techniques. Finally, different machine learning models were applied to predict review quality, and feature analysis was performed to identify effective features.
Findings
The analysis revealed that reviewers have become more conservative and that review quality has declined over time according to the proposed indicators. Among the three models, the random forest model achieves the best performance on all three tasks. Sentiment polarity, review length, response length and readability are important factors that distinguish peer review quality, which can help meta-reviewers weight more valuable reviews when making final decisions.
Originality/value
This study provides a new perspective for assessing review quality. A further contribution lies in the proposal of a novel task of predicting review quality. To address this task, a model incorporating various feature sets was proposed, thereby deepening the understanding of peer reviews.
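To make the feature categories concrete, the sketch below computes the four indicators the abstract singles out as most important (sentiment polarity, review length, response length, readability). The word lexicon and the Flesch-style readability proxy are illustrative stand-ins, not the authors' implementation, and the function names are hypothetical.

```python
import re

# Tiny illustrative sentiment lexicon; a real pipeline would use a
# trained sentiment model or a published lexicon.
POSITIVE = {"clear", "novel", "strong", "interesting", "solid"}
NEGATIVE = {"unclear", "weak", "incremental", "confusing", "flawed"}


def sentiment_polarity(text: str) -> float:
    """Crude lexicon score in [-1, 1]: (pos - neg) / (pos + neg)."""
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)


def readability(text: str) -> float:
    """Flesch-style proxy: shorter sentences and words score higher."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    if not sentences or not words:
        return 0.0
    avg_sentence_len = len(words) / len(sentences)
    avg_word_len = sum(len(w) for w in words) / len(words)
    return 206.835 - 1.015 * avg_sentence_len - 8.46 * avg_word_len


def review_features(review: str, response: str) -> dict:
    """Feature vector for one (review, author-response) pair."""
    return {
        "review_length": len(review.split()),
        "response_length": len(response.split()),
        "sentiment_polarity": sentiment_polarity(review),
        "readability": readability(review),
    }
```

Vectors like these, together with the submission-side features the paper describes, would then be fed to standard classifiers such as a random forest, the model the authors report as best-performing.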
Subject
Library and Information Sciences, Computer Science Applications
Cited by
4 articles.