
Student Engagement Detection Using Emotion Analysis, Eye Tracking and Head Movement with Machine Learning

  • Conference paper
Technology and Innovation in Learning, Teaching and Education (TECH-EDU 2022)

Abstract

With the increase of distance learning in general, and e-learning in particular, having a system capable of determining student engagement is of paramount importance, and one of the biggest challenges, for teachers, researchers and policy makers alike. Here, we present a system that detects the engagement level of students. It uses only the information provided by the typical built-in web camera of a laptop computer, and was designed to work in real time. We combine information about eye movements, head movements and facial emotions to produce a concentration index with three engagement classes: “very engaged”, “nominally engaged” and “not engaged at all”. The system was tested in a typical e-learning scenario, and the results show that it correctly identifies the periods in which students were “very engaged”, “nominally engaged” and “not engaged at all”. The results also show that the students with the best scores also have the highest concentration indexes.
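The abstract only outlines the pipeline, so the following is a minimal sketch of how such a per-frame fusion could look, not the authors' implementation: it assumes OpenCV's Haar-cascade face detector as the front end, `emotion_weight()` and `attention_weight()` are hypothetical placeholders for the facial-emotion classifier and the eye/head-tracking module, and the weights and thresholds are illustrative rather than values reported in the paper.

```python
# A minimal sketch, not the authors' implementation: fusing per-frame facial
# emotion and head/eye position cues from a webcam into one of the three
# engagement classes described in the abstract. OpenCV's Haar-cascade face
# detector is assumed; emotion_weight() and attention_weight() are hypothetical
# placeholders, and all weights/thresholds are illustrative only.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def emotion_weight(face_roi):
    """Hypothetical stand-in: a real system would run a facial-emotion CNN
    here and map the predicted emotion to a weight in [0, 1]."""
    return 0.6


def attention_weight(frame, face_box):
    """Hypothetical stand-in for eye/head tracking: score 1.0 when the face
    is roughly centred (presumably looking at the screen), 0.5 otherwise."""
    x, y, w, h = face_box
    face_cx = x + w / 2
    frame_cx = frame.shape[1] / 2
    return 1.0 if abs(face_cx - frame_cx) < 0.2 * frame.shape[1] else 0.5


def engagement_label(concentration_index):
    # Illustrative thresholds for the three classes named in the abstract.
    if concentration_index >= 0.7:
        return "very engaged"
    if concentration_index >= 0.4:
        return "nominally engaged"
    return "not engaged at all"


cap = cv2.VideoCapture(0)                 # built-in laptop webcam
for _ in range(300):                      # process a short burst of frames
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:                   # no face => student not at the screen
        label = "not engaged at all"
    else:
        x, y, w, h = faces[0]
        ci = emotion_weight(gray[y:y + h, x:x + w]) * attention_weight(frame, faces[0])
        label = engagement_label(ci)
    print(label)
cap.release()
```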



Author information

Correspondence to Prabin Sharma.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Sharma, P. et al. (2022). Student Engagement Detection Using Emotion Analysis, Eye Tracking and Head Movement with Machine Learning. In: Reis, A., Barroso, J., Martins, P., Jimoyiannis, A., Huang, R.Y.M., Henriques, R. (eds) Technology and Innovation in Learning, Teaching and Education. TECH-EDU 2022. Communications in Computer and Information Science, vol 1720. Springer, Cham. https://doi.org/10.1007/978-3-031-22918-3_5

  • DOI: https://doi.org/10.1007/978-3-031-22918-3_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-22917-6

  • Online ISBN: 978-3-031-22918-3

  • eBook Packages: Computer Science, Computer Science (R0)
