Affiliation:
1. Moscow City University
2. Moscow City University
Abstract
This paper reviews key research on automatic engagement detection in education. Although automatic engagement detection is necessary for enhancing the educational process, there is a lack of out-of-the-box technical solutions. Engagement can be detected by tracing learning-centered affects — interest, confusion, frustration, delight, anger, boredom — and their facial and bodily expressions. Most researchers detect these emotions in video using the Facial Action Coding System (FACS). However, no ready-made set of criteria for detecting engagement exists, so many scientists use additional techniques such as self-reports, audio data, physiological indicators, and others. In this paper we review the most recent research in the field of automatic affect and engagement detection and present our theoretical model of engagement in the educational process based on the detection of learning-centered affects. Engagement is understood as an affective and cognitive state accompanying the learning process. On the way to optimal engagement, students experience various affects; strongly positive and negative feelings indicate that a student is close to being engaged in the learning process.
Publisher
Federal State-Financed Educational Institution of Higher Education Moscow State University of Psychology and Education
Cited by: 4 articles.