Abstract
In the article "Evaluating institutional open access performance: Methodology, challenges and assessment" we develop the first comprehensive and reproducible workflow that integrates multiple bibliographic data sources for evaluating institutional open access (OA) performance. The major data sources include Web of Science, Scopus, Microsoft Academic, and Unpaywall. However, each of these databases continues to update, both actively and retrospectively. This implies that the results produced by the proposed process are potentially sensitive both to the choice of data source and to the specific version used. In addition, there remain issues relating to selection bias, sample size, and margin of error. The current work shows that the levels of sensitivity relating to the above issues can be significant at the institutional level. Hence, transparency and clear documentation of the choices made about data sources (and their versions) and cut-off boundaries are vital for reproducibility and verifiability.
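For illustration, the sensitivity to sample size can be made concrete with a standard margin-of-error calculation for an estimated OA proportion. The sketch below is not taken from the article: it assumes a simple random sample of an institution's outputs and uses the normal approximation to the binomial, and the function name oa_margin_of_error and the example figures (a 40% OA share from 500 sampled records) are chosen purely for this example.

    import math

    def oa_margin_of_error(oa_fraction, sample_size, z=1.96):
        # Approximate 95% margin of error for an observed OA share,
        # using the normal approximation to the binomial
        # (assumes a simple random sample of outputs).
        return z * math.sqrt(oa_fraction * (1 - oa_fraction) / sample_size)

    # Hypothetical example: a 40% OA share estimated from 500 sampled outputs.
    p, n = 0.40, 500
    print(f"OA level: {p:.0%} +/- {oa_margin_of_error(p, n):.1%}")  # about 40% +/- 4.3%

At a sample size of a few hundred records the uncertainty band alone spans several percentage points, which can be of the same order as the differences between institutions' reported OA levels.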
Publisher
Cold Spring Harbor Laboratory
References
1. Björk, B.-C., Welling, P., Laakso, M., Majlender, P., Hedlund, T., & Guðnason, G. (2010). Open Access to the Scientific Journal Literature: Situation 2009. PLoS ONE, 5(6), e11273. https://doi.org/10.1371/journal.pone.0011273
2. Huang, C.-K., Neylon, C., Brookes-Kenworthy, C., Hosking, R., Montgomery, L., Wilson, K., & Ozaygen, A. (2020a). Comparison of bibliographic data sources: Implications for the robustness of university rankings. Quantitative Science Studies: Just Accepted, 1–54. https://doi.org/10.1162/qss_a_00031
3. Huang, C.-K., Neylon, C., Hosking, R., Brookes-Kenworthy, C., Montgomery, L., Wilson, K., & Ozaygen, A. (2020b). Evaluating institutional open access performance: Methodology, challenges and assessment. Under Review, 1–18. https://doi.org/10.5281/zenodo.3694943
4. Laakso, M., Welling, P., Bukvova, H., Nyman, L., Björk, B.-C., & Hedlund, T. (2011). The Development of Open Access Journal Publishing from 1993 to 2009. PLoS ONE, 6(6), e20961. https://doi.org/10.1371/journal.pone.0020961
5. Status of open access in the biomedical field in 2005. Journal of the Medical Library Association (JMLA), 2009.