Co‐designing enduring learning analytics prediction and support tools in undergraduate biology courses

Authors:

Plumley, Robert D. (1); Bernacki, Matthew L. (1, 2); Greene, Jeffrey A. (1); Kuhlmann, Shelbi (3); Raković, Mladen (4); Urban, Christopher J. (5); Hogan, Kelly A. (6); Lee, Chaewon (7); Panter, Abigail T. (7); Gates, Kathleen M. (7)

Affiliations:

1. School of Education, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA

2. Department of Education, Brain and Motivation Research Institute, Korea University, Seoul, South Korea

3. Department of Psychology, Institute for Intelligent Systems, University of Memphis, Memphis, Tennessee, USA

4. Department of Human Centred Computing, Monash University, Melbourne, Victoria, Australia

5. Department of Psychology, University of Rhode Island, Kingston, Rhode Island, USA

6. Trinity College of Arts & Sciences, Duke University, Durham, North Carolina, USA

7. Department of Psychology & Neuroscience, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA

Abstract

Even highly motivated undergraduates drift off their STEM career pathways. In large introductory STEM classes, instructors struggle to identify and support these students. To address these issues, we developed co‐redesign methods in partnership with disciplinary experts to create high‐structure STEM courses that better support students and produce informative digital event data. To those data, we applied theory‐ and context‐relevant labels to reflect active and self‐regulated learning processes involving LMS‐hosted course materials, formative assessments, and help‐seeking tools. We illustrate the predictive benefits of this process across two cycles of model creation and reapplication. In cycle 1, we used theory‐relevant features from 3 weeks of data to inform a prediction model that accurately identified struggling students and sustained its accuracy when reapplied in future semesters. In cycle 2, we refit a model with temporally contextualized features that achieved superior accuracy using data from just two class meetings. This modelling approach can produce durable learning analytics solutions that afford scaled and sustained prediction and intervention opportunities involving explainable artificial intelligence products. The same products that inform prediction can also guide intervention approaches and inform future instructional design and delivery.

Practitioner notes

What is already known about this topic

Learning analytics includes an evolving collection of methods for tracing and understanding student learning through their engagements with learning technologies.

Prediction models based on demographic data can perpetuate systemic biases.

Prediction models based on behavioural event data can produce accurate predictions of academic success, and validation efforts can enrich those data to reflect students' self‐regulated learning processes within learning tasks.

What this paper adds

Learning analytics can be successfully applied to predict performance in an authentic postsecondary STEM context, and the use of context and theory as guides for feature engineering can ensure sustained predictive accuracy upon reapplication.

The consistent types of learning resources and the cyclical nature of their provisioning from lesson to lesson are hallmarks of high‐structure active learning designs that are known to benefit learners. These designs also provide opportunities for observing and modelling contextually grounded, theory‐aligned and temporally positioned learning events, which informed prediction models that accurately classified students upon initial application and upon reapplication in subsequent semesters.

Co‐design relationships in which researchers and instructors work together toward pedagogical implementation and course instrumentation are essential to developing unique insights for feature engineering and to producing explainable artificial intelligence approaches to predictive modelling.

Implications for practice and/or policy

High‐structure course designs can scaffold student engagement with course materials to make learning more effective and the products of feature engineering more explainable.

Learning analytics initiatives can avoid perpetuating systemic biases when methods prioritize theory‐informed behavioural data that reflect learning processes, sensitivity to instructional context, and the development of explainable predictors of success, rather than relying on students' demographic characteristics as predictors.

Prioritizing behaviours as predictors improves explainability in ways that can inform the redesign of courses and the design of learning supports, which further informs the refinement of learning theories and their applications.
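The abstract describes labelling LMS event data with theory-aligned learning processes and then fitting an interpretable prediction model on the resulting per-student features. The sketch below illustrates that general idea in Python; the event fields, labelling rules, outcome variable and choice of logistic regression are assumptions made for illustration, not the authors' actual features or model.

```python
# Minimal, illustrative sketch of the two-step idea in the abstract:
# (1) label raw LMS click events with theory-aligned self-regulated
#     learning processes, (2) fit an interpretable classifier on
#     per-student counts of those processes.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical raw event log exported from an LMS.
events = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3],
    "resource": ["guided_notes", "practice_quiz", "guided_notes",
                 "help_forum", "practice_quiz"],
    "days_before_class": [2, 1, 0, 0, 1],
})

# Step 1: map each event to a theory-relevant learning process,
# e.g. previewing materials before class vs. reviewing them on the
# day of class, self-testing with formative quizzes, or seeking help.
def label_event(row):
    if row["resource"] == "practice_quiz":
        return "self_testing"
    if row["resource"] == "help_forum":
        return "help_seeking"
    return "previewing" if row["days_before_class"] > 0 else "reviewing"

events["process"] = events.apply(label_event, axis=1)

# Per-student counts of each labelled process become model features.
X = pd.crosstab(events["student_id"], events["process"])

# Hypothetical outcome: 1 = struggled on the first exam.
y = pd.Series([0, 1, 0], index=X.index)

# Step 2: an interpretable model keeps the behaviour-to-risk link
# explicit, so the same coefficients that drive prediction can also
# point instructors toward intervention targets.
model = LogisticRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_[0])))
```

In this kind of setup, replacing demographic predictors with behavioural process counts is what keeps the resulting model explainable and actionable for instructors, in line with the practitioner notes above.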

Funder

National Science Foundation

Publisher

Wiley

Cited by 1 article.
