Abstract
Schönbrodt et al. (2022) and Gärtner et al. (2022) aim to outline in the target articles why and how research assessment could be improved in psychological science in accordance with DORA, focusing on abandoning the impact factor as an indicator of research quality and aligning assessment with methodological rigor and open science practices. However, I argue that their attempt is guided by a rather narrow statistical and quantitative understanding of knowledge production in psychological science. Consequently, the authors neglect the epistemic diversity within psychological science and thereby risk committing epistemic injustice. The criteria they introduce for research assessment might be appropriate for some approaches to knowledge production, but they could neglect or systematically disadvantage others. Furthermore, I claim that the authors lack epistemic (intellectual) humility about their proposal. Further information is required regarding when and for which approaches their proposal is appropriate and, perhaps even more importantly, when and where it is not. Similarly, many of the proposed improvements of the reform movement, including the one introduced in the target articles, amount to little more than trial and error, because their epistemic usefulness has not been investigated and the underlying mechanisms and theories are not well understood. Finally, I argue that greater awareness of epistemic diversity in psychological science, combined with more epistemic (intellectual) humility, could attenuate the danger of epistemic injustice.
References
1. Agreement on reforming research assessment. (2022). https://coara.eu/agreement/the-agreementfull-text/
2. Barba, L. A. (2018). Terminologies for reproducible research. https://doi.org/10.48550/ARXIV.1802.03311
3. Dames, H., Musfeld, P., Popov, V., Oberauer, K., & Frischkorn, G. T. (2023). Responsible research assessment should prioritize theory development and testing over ticking open science boxes. https://doi.org/10.31234/osf.io/ad74m
4. Devezer, B., Navarro, D. J., Vandekerckhove, J., & Buzbas, E. O. (2021). The case for formal methodology in scientific reform. Royal Society Open Science, 8(3), 200805. https://doi.org/10.1098/rsos.200805
5. Gärtner, A., Leising, D., & Schönbrodt, F. D. (2022). Responsible research assessment II: A specific proposal for hiring and promotion in psychology. https://doi.org/10.31234/osf.io/5yexm