Abstract
Programming is a complex learning activity that requires the coordination of cognitive processes and affective states. These aspects are often studied in isolation in computing education research, leaving a limited understanding of how and when students learn best. This limits researchers’ ability to contextualize evidence-driven outcomes when learning behaviour deviates from pedagogical intentions. Multimodal learning analytics (MMLA) captures data essential for measuring constructs (e.g., cognitive load, confusion) that the learning sciences posit as important for learning and that cannot be measured effectively with programming process data (IDE-log data) alone. We therefore augmented IDE-log data with physiological data (e.g., gaze data) and participants’ facial expressions collected during a debugging learning activity. The findings emphasize the need for learning analytics that are consequential for learning, rather than merely easy and convenient to collect. In that regard, our paper aims to provoke productive reflections and conversations among educators about the potential of MMLA to expand and advance the synergy between learning analytics and learning design, moving it from a post-evaluation, design-aware process to a continuous process of monitoring and adaptation.
Publisher
Society for Learning Analytics Research
Subject
Computer Science Applications, Education
Cited by
23 articles.