Abstract
When used effectively, reflective writing tasks can deepen learners’ understanding of key concepts, help them critically appraise their developing professional identity, and build qualities for lifelong learning. As such, reflective writing is attracting substantial interest from universities concerned with experiential learning, reflective practice, and developing a holistic conception of the learner. However, reflective writing is for many students a novel genre to compose in, and tutors may be inexperienced in its assessment. While these conditions set a challenging context for automated solutions, natural language processing may also help address the challenge of providing real-time, formative feedback on draft writing. This paper reports progress in designing a writing analytics application, detailing the methodology by which informally expressed rubrics are modelled as formal rhetorical patterns, a capability delivered by a novel web application. Preliminary tests on an independently human-annotated corpus are encouraging, showing improvements from the first to second version, but with much scope for improvement. We discuss a range of issues: the prevalence of false positives in the tests, areas for future technical improvements, the risks of gaming the system, and the participatory design process that has enabled work across disciplinary boundaries to develop the prototype to its current state.
Publisher
Society for Learning Analytics Research
Subject
Computer Science Applications, Education
Cited by
24 articles.