Author:
Huang, Wen; Walkington, Candace; Nathan, Mitchell J.
Abstract
This study investigates how learners collaboratively construct embodied geometry knowledge in shared VR environments. Three groups of in-service teachers collaboratively explored six geometric conjectures with virtual objects (geometric shapes) under the guidance of a facilitator. Although the teachers were in different physical locations, each group logged into a single virtual classroom, where members could see and manipulate the same geometric shapes and observe their collaborators’ avatars and actions on those shapes in real time. This paper introduces a novel multimodal data analysis method for analyzing participants’ interactive patterns across collaborative actions, gestures, movements, and speech. Results show that collaborative speech has a strong simultaneous relationship with actions on virtual objects and with virtual hand gestures. They also show that body movements and positions tend to center on the virtual objects, and that shifts in these movements away from or around an object often signal key collaborative interactional events. In addition, this paper presents five emergent multimodal interaction themes that capture participants’ collaborative patterns and strategies across different problem-solving stages. The results suggest that virtual objects can be effective media for collaborative knowledge building in shared VR environments, and that structured activity design and moderate realism may benefit shared VR learning environments in terms of equity, adaptability, and cost-effectiveness. We show how multimodal data analysis can be multi-dimensional, visualized, and conducted at both micro and macro levels.
Funder
Institute of Education Sciences
Southern Methodist University
Publisher
Springer Science and Business Media LLC
Subject
Human-Computer Interaction, Education
Cited by
4 articles.