Abstract
Purpose
The purpose of this paper is to examine the effects of the use of the citation‐based journal impact factor for evaluative purposes upon the behaviour of authors and editors. It seeks to give a critical examination of a number of claims regarding the manipulability of this indicator, on the basis of an empirical analysis of the publication and referencing practices of authors and journal editors.
Design/methodology/approach
The paper describes mechanisms that may affect the numerical values of journal impact factors. It also analyses general, “macro” patterns in large samples of journals in order to obtain indications of the extent to which such mechanisms are actually applied on a large scale. Finally, it presents case studies of particular science journals in order to illustrate what their effects may be in individual cases.
Findings
The paper shows that the commonly used journal impact factor can, to some extent, be manipulated relatively easily. It discusses several types of strategic editorial behaviour and presents cases in which journal impact factors were, intentionally or otherwise, affected by particular editorial strategies. These findings lead to the conclusion that one must be very careful in interpreting and using journal impact factors, and that authors, editors and policy makers must be aware of their potential manipulability. They also show that some mechanisms still occur rather infrequently, while for others it is very difficult, if not impossible, to assess empirically how often they are actually applied. If their frequency of occurrence increases, one should conclude that the impact of impact factors is decreasing.
Originality/value
The paper systematically describes a number of claims about the manipulability of journal impact factors that are often based on “informal” or even anecdotal evidence, and illustrates how these claims can be further examined in thorough empirical research on large data samples.
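For reference, the standard two‐year impact factor to which such analyses typically refer is the Garfield definition used by the Science Citation Index; the abstract does not spell out the exact variant the paper works with, so the following is only an illustrative sketch of that standard formula. For a journal in year $Y$,

\[
\mathrm{IF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}},
\]

where $C_Y(y)$ is the number of citations received in year $Y$ by items the journal published in year $y$, and $N_y$ is the number of “citable items” (articles and reviews) published in year $y$. For example, 300 citations in year $Y$ to material from the two preceding years, with 150 citable items published in those years, gives $\mathrm{IF}_Y = 300/150 = 2.0$. Because the numerator counts citations to all items while the denominator counts only citable items, citations to editorials, letters and other front matter raise the value without enlarging the denominator; this asymmetry is one frequently discussed route for strategic editorial behaviour, although the abstract itself does not state which specific mechanisms the paper analyses.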
Subject
Library and Information Sciences, Information Systems
Cited by
55 articles.