Authors:
Zhou Qingqing, Zhang Chengzhi
Abstract
Purpose
As with academic papers, the customary methods for assessing the impact of books are citation-based, which is straightforward but limited by database coverage. Alternative metrics, such as blog citations and library holdings, can be used to avoid these limitations; however, they generally ignore content-level information and thus overlook users’ intentions. Meanwhile, abundant academic reviews express scholars’ opinions on books, and these opinions can be used to assess book impact via fine-grained review mining. Hence, this study aims to assess the use impact of books by automatically mining the content of academic reviews, thereby confirming the usefulness of academic reviews to libraries and readers.
Design/methodology/approach
First, 61,933 academic reviews from Choice: Current Reviews for Academic Libraries were collected, along with three metadata metrics. Then, the review contents were mined to obtain content metrics. Finally, to verify the reliability of academic reviews, the Choice review metrics were compared with other use-impact assessment metrics and analysed. As a rough illustration of this comparison step, see the sketch below.
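The following is a minimal sketch, not the authors’ actual pipeline: it scores review texts with an off-the-shelf sentiment model and correlates the scores with a holdings-style impact metric. The review snippets, holdings counts, and tool choices (NLTK’s VADER, SciPy’s Spearman correlation) are assumptions for demonstration only.

```python
# Minimal sketch (not the paper's method): score review sentiment and
# correlate it with another use-impact metric. All data below is hypothetical.
from nltk.sentiment import SentimentIntensityAnalyzer  # requires: nltk.download('vader_lexicon')
from scipy.stats import spearmanr

# Hypothetical review snippets and matching library-holdings counts per book.
reviews = [
    "A thorough, well-organized introduction; highly recommended for undergraduates.",
    "Coverage is uneven and the examples feel dated.",
    "An essential reference for any collection supporting graduate research.",
]
holdings = [1250, 310, 980]  # hypothetical holdings counts for the same three books

analyzer = SentimentIntensityAnalyzer()
# The compound score in [-1, 1] stands in for a coarse review-based impact metric.
scores = [analyzer.polarity_scores(text)["compound"] for text in reviews]

# Rank correlation between the review-based metric and the holdings metric.
rho, p_value = spearmanr(scores, holdings)
print(f"Review sentiment scores: {scores}")
print(f"Spearman correlation with holdings: rho={rho:.2f}, p={p_value:.3f}")
```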
Findings
The analysis results reveal that fine-grained mining of academic reviews can help users quickly understand the multi-dimensional features of books and judge or predict the impact of books at scale, thereby providing guidance for different types of users (e.g. libraries and public readers) in book selection.
Originality/value
Book impact assessment via content mining can provide more detailed information for a large number of users and address the shortcomings of traditional methods. It offers a new perspective and method for research on use impact assessment. Moreover, the proposed method might also be applied to measure publications other than books.
Subject
Library and Information Sciences, Computer Science Applications
Cited by
4 articles.