Abstract
Purpose – Through a two-stage survey, this paper examines how researchers judge the quality of answers on ResearchGate Q&A, an academic social networking site.
Design/methodology/approach – In the first-stage survey, 15 researchers from Library and Information Science (LIS) judged the quality of 157 answers to 15 questions and reported the criteria that they had used. The content of their reports was analyzed, and the results were merged with relevant criteria from the literature to form the second-stage survey questionnaire. This questionnaire was then completed by researchers recognized as accomplished at identifying high-quality LIS answers on ResearchGate Q&A.
Findings – Most of the identified quality criteria for academic answers—such as relevance, completeness, and verifiability—have previously been found applicable to generic answers. The authors also found other criteria, such as comprehensiveness, the answerer's scholarship, and value-added. Providing opinions was found to be the most important criterion, followed by completeness and value-added.
Originality/value – The findings show the importance of studying the quality of answers on academic social Q&A platforms and reveal unique considerations for the design of such systems.
Subject
Library and Information Sciences, Computer Science Applications, Information Systems
Cited by 15 articles.