Abstract
A Review of:
Rodriguez, S., & Mune, C. (2022). Uncoding library chatbots: Deploying a new virtual reference tool at the San Jose State University Library. Reference Services Review, 50(3), 392-405. https://doi.org/10.1108/RSR-05-2022-0020
Objective – To describe the development of an artificial intelligence (AI) chatbot to support virtual reference services at an academic library.
Design – Case study.
Setting – A public university library in the United States.
Subjects – 1,682 chatbot-user interactions.
Methods – A university librarian and two graduate student interns researched and developed an AI chatbot to meet virtual reference needs. Built with the chatbot development software Dialogflow, the chatbot was populated with questions, keywords, and other training phrases drawn from user inquiries; text-based responses to those inquiries; and intents (i.e., programmed mappings between user inquiries and chatbot responses). The chatbot used natural language processing and AI training to answer basic circulation and reference questions, and it included interactive elements and embeddable widgets supported by Kommunicate (i.e., a bot support platform for chat widgets). The chatbot was enabled outside of live reference hours. User interactions with the chatbot were collected over the 18 months following its launch, and the authors used analytics from Kommunicate and Dialogflow to examine those interactions.
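As a rough illustration of the intent model described above (not code from the reviewed study), the sketch below uses Google's Dialogflow ES Python client to define a hypothetical "library hours" intent. The display name, training phrases, and response text are invented for illustration, and the snippet assumes a Google Cloud project with a Dialogflow agent already configured.

```python
# Minimal sketch: defining a Dialogflow ES intent that maps sample user
# inquiries (training phrases) to a text response. Requires the
# google-cloud-dialogflow package and an existing Dialogflow agent.
from google.cloud import dialogflow


def create_library_hours_intent(project_id: str):
    """Create a hypothetical 'library hours' intent (illustrative only)."""
    intents_client = dialogflow.IntentsClient()
    parent = dialogflow.AgentsClient.agent_path(project_id)

    # Training phrases: example user inquiries the intent should match.
    phrases = [
        "What are the library's hours?",
        "When does the library close today?",
        "Is the library open on weekends?",
    ]
    training_phrases = [
        dialogflow.Intent.TrainingPhrase(
            parts=[dialogflow.Intent.TrainingPhrase.Part(text=p)]
        )
        for p in phrases
    ]

    # Text response returned when the intent is triggered.
    message = dialogflow.Intent.Message(
        text=dialogflow.Intent.Message.Text(
            text=["Current building hours are posted on the library website."]
        )
    )

    intent = dialogflow.Intent(
        display_name="library.hours",
        training_phrases=training_phrases,
        messages=[message],
    )
    return intents_client.create_intent(
        request={"parent": parent, "intent": intent}
    )
```

In this model, each intent bundles the keywords and phrasing users might type with the single response the chatbot should return, which is why the authors' maintenance work centered on adding and refining training phrases as new inquiries arrived.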
Main Results – User interactions increased gradually after the chatbot's launch. The chatbot logged approximately 44 interactions per month during the spring 2021 term, rising to approximately 137 per month during the spring 2022 term. Using the intents triggered by user inquiries, the authors identified the most common reasons users engaged the chatbot: hours for the library building and live reference services, finding library resources (e.g., peer-reviewed articles, books), getting help from a librarian, locating databases and research guides, information about borrowing library items (e.g., laptops, books), and reporting issues with library resources.
Conclusion – Libraries can successfully develop and train AI chatbots with minimal technical expertise and resources. The authors offered user experience considerations drawn from the project, including editing library FAQs to be concise and easy to understand, testing chatbot text and elements to ensure they are accessible, and continuously maintaining chatbot content. Kommunicate, Dialogflow, Google Analytics, and Crazy Egg (i.e., a web usage analytics tool) could not provide more in-depth user data (e.g., user clicks, scroll maps, heat maps), and the authors planned to explore other usage analysis software to collect such data. They also noted that only 10% of users engaged the chatbot beyond the initial welcome prompt, indicating that more research and user testing are needed on how to facilitate user engagement.
Publisher – University of Alberta Libraries