Abstract
Collaboration is argued to be an important skill, not only in schools and higher education contexts but also in the workplace and other aspects of life. However, simply asking students to work together as a group on a task does not guarantee success in collaboration. Effective collaborative learning requires meaningful interactions among individuals in a group. Recent advances in multimodal data collection tools and AI provide unique opportunities to analyse, model, and support these interactions. This study proposes an original method to identify group interactions in real-world collaborative learning activities and investigates the variations in interactions of groups with different collaborative learning outcomes. The study was conducted in a 10-week post-graduate course involving 34 students, with data collected from groups' weekly collaborative learning interactions lasting ~60 min per session. The results showed that groups with different levels of shared understanding exhibited significant differences in the time spent on, and maximum duration of, referring and following behaviours. Further analysis using process mining techniques revealed that groups with different outcomes exhibited different patterns of group interactions. A loop between students' referring and following behaviours and resource management behaviours was identified in groups with better collaborative learning outcomes. The study indicates that the nonverbal behaviours studied here, which can be auto-detected with advanced computer vision techniques and multimodal data, have the potential to distinguish groups with different collaborative learning outcomes. Insights generated can also support the practice of collaborative learning for learners and educators. Further research should examine the cross-context validity of the proposed distinctions and explore the approach's potential to be developed into a real-world, real-time support system for collaborative learning.
Publisher
Springer Science and Business Media LLC
Subject
Library and Information Sciences, Education
Cited by
5 articles.