Abstract
With the growth of distance learning in general, and e-learning in particular, a system capable of determining student engagement is of paramount importance, and one of the biggest challenges, for teachers, researchers, and policy makers alike. Here, we present a system that detects the engagement level of students. It uses only the information provided by the typical built-in webcam of a laptop computer, and it was designed to work in real time. We combine information about eye and head movements with facial emotions to produce a concentration index with three classes of engagement: “very engaged”, “nominally engaged” and “not engaged at all”. The system was tested in a typical e-learning scenario, and the results show that it correctly identifies each period of time in which students were “very engaged”, “nominally engaged” and “not engaged at all”. The results also show that the students with the best test scores also had the highest concentration indices.
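The abstract describes the fusion step only at a high level; the minimal Python sketch below illustrates one way such a fusion could look: per-frame scores for gaze, head pose, and facial emotion are combined into a weighted concentration index and thresholded into the three engagement classes. The function names, weights, and thresholds are illustrative assumptions, not the values used by the authors.

# A minimal sketch of the kind of signal fusion described above: per-frame
# eye-gaze, head-pose and facial-emotion scores are combined into a single
# concentration index and mapped to the three engagement classes.
# The weights and thresholds are illustrative assumptions only.

def concentration_index(gaze_on_screen: float,
                        head_toward_screen: float,
                        emotion_positivity: float,
                        weights=(0.4, 0.3, 0.3)) -> float:
    """Weighted combination of per-frame scores, each assumed normalized to [0, 1]."""
    w_gaze, w_head, w_emotion = weights
    ci = (w_gaze * gaze_on_screen
          + w_head * head_toward_screen
          + w_emotion * emotion_positivity)
    return max(0.0, min(1.0, ci))

def engagement_class(ci: float) -> str:
    """Map a concentration index to the three engagement classes."""
    if ci >= 0.66:
        return "very engaged"
    if ci >= 0.33:
        return "nominally engaged"
    return "not engaged at all"

if __name__ == "__main__":
    # Example: gaze mostly on screen, head facing the camera, neutral expression.
    ci = concentration_index(0.9, 0.8, 0.5)
    print(f"{ci:.2f} -> {engagement_class(ci)}")

In practice, the three classes could also be obtained by training a classifier on the raw features rather than by thresholding a hand-weighted index; the sketch only shows the simplest fusion strategy consistent with the description above.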