Author:
Agada Ruth, Yan Jie, Xu Weifeng
Publisher:
Springer International Publishing
References: 39 articles.
1. Grafsgaard, J.F., Wiggins, J.B., Boyer, K.E., Wiebe, E.N., Lester, J.C.: Predicting learning and affect from multimodal data streams in task-oriented tutorial dialogue. In: Proceedings of the 7th International Conference on Educational Data Mining (EDM), pp. 122–129 (2014)
2. Grafsgaard, J.F., Wiggins, J.B., Boyer, K.E., Wiebe, E.N., Lester, J.C.: Automatically recognizing facial indicators of frustration: a learning-centric analysis. In: Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013), pp. 159–165 (2013)
3. Whitehill, J., Serpell, Z., Foster, A., Lin, Y.-C., Pearson, B., Bartlett, M., Movellan, J.: Towards an optimal affect-sensitive instructional system of cognitive skills. In: Computer Vision and Pattern Recognition Workshops 2011, pp. 20–25 (2011)
4. Bellegarda, J.R.: A data-driven affective analysis framework toward naturally expressive speech synthesis. IEEE Trans. Audio Speech Lang. Process. 19(5), 1113–1122 (2010)
5. D’Mello, S., Dale, R., Graesser, A.: Disequilibrium in the mind, disharmony in the body. Cogn. Emot. 26(2), 362–374 (2012)