
Multi-componential Emotion Recognition in VR Using Physiological Signals

  • Conference paper
  • AI 2022: Advances in Artificial Intelligence (AI 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13728)

Abstract

Emotion recognition affords new approaches, ranging from context-awareness to more efficient system interaction, by giving systems the ability to perceive and express emotions. While most studies rely on discrete and dimensional theoretical models of emotion, neuroscience findings align with a multi-componential interpretation of emotional phenomena. One such componential theory is the Component Process Model (CPM), which comprises five synchronized components: appraisal, motivation, physiology, expression, and feeling. However, limited attention has been paid to the systematic investigation of emotions under a full CPM. In this preliminary analysis, we therefore induced a range of emotions using 27 interactive Virtual Reality (VR) games and measured the responses of 28 participants across the CPM components, 20 discrete emotion terms, heart activity, skin conductance, and facial electromyography. Our work analyzes the relationship between discrete theory-based emotions and the theoretically defined components together with physiological measures, as well as the correlation between subjective expression terms and objective facial expressions. Our Machine Learning (ML) analysis reveals a significant relationship between emotions and the full set of componential features combined with physiological signals, and our study highlights the role of each CPM component in emotion differentiation.
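To make the kind of analysis described above concrete, the following is a minimal sketch (assuming Python with pandas and scikit-learn, which the paper does not specify) of one way to relate physiological features and CPM component ratings to discrete emotion labels. The file name, column names, feature set, and classifier are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch (not the authors' pipeline): predict a discrete emotion label
# from physiological features combined with CPM component ratings.
# All names below (file path, columns, features) are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-trial table: one row per participant x VR game.
df = pd.read_csv("trials.csv")  # placeholder path

physio_cols = ["hr_mean", "hrv_rmssd", "scr_count", "emg_zyg_rms", "emg_corr_rms"]
cpm_cols = ["appraisal", "motivation", "physiology", "expression", "feeling"]

X = df[physio_cols + cpm_cols].to_numpy()
y = df["emotion_label"].to_numpy()          # one of the discrete emotion terms
groups = df["participant_id"].to_numpy()    # keep each participant in one fold

clf = make_pipeline(
    StandardScaler(),
    RandomForestClassifier(n_estimators=300, random_state=0),
)

# Participant-independent cross-validation.
scores = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=groups)
print(f"Mean accuracy: {scores.mean():.3f} ± {scores.std():.3f}")
```

Grouping folds by participant keeps all of a participant's trials in a single fold, a common choice when the goal is participant-independent emotion recognition rather than within-subject prediction.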


Author information

Corresponding author

Correspondence to Rukshani Somarathna.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Somarathna, R., Quigley, A., Mohammadi, G. (2022). Multi-componential Emotion Recognition in VR Using Physiological Signals. In: Aziz, H., Corrêa, D., French, T. (eds.) AI 2022: Advances in Artificial Intelligence. AI 2022. Lecture Notes in Computer Science (LNAI), vol. 13728. Springer, Cham. https://doi.org/10.1007/978-3-031-22695-3_42

  • DOI: https://doi.org/10.1007/978-3-031-22695-3_42

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-22694-6

  • Online ISBN: 978-3-031-22695-3

  • eBook Packages: Computer Science, Computer Science (R0)
