Human Emotion Recognition: Review of Sensors and Methods
Figure 1. Russell’s circumplex model of emotions.
Figure 2. Electroencephalography (EEG) measurements: (a) distribution of EEG electrodes on the human scalp [39]; (b) special headset with installed electrodes [40].
Figure 3. EEG signal: (a) example of raw data [43]; (b) peak-to-peak signal amplitude evaluation technique [44].
Figure 4. Schematic representation of electrocardiography (ECG) [69]: (a) 12-lead ECG: RA, LA, LL, RL; (b) example of ECG signals.
Figure 5. ECG procedure [72]: (a) typical setup; (b) main parameters of an ECG heartbeat signal.
Figure 6. Possible places for attaching GSR electrodes [86].
Figure 7. Example of a raw GSR signal. The blue area indicates the phasic component of the signal; the grey area represents the tonic component. The red line indicates the trigger (moment of delivery of the stimulus) [88].
Figure 8. Principle of photoplethysmography (PPG) [104]: (a) reflective mode; (b) transmitting mode; (c) example of a PPG signal.
Figure 9. Comparison between ECG and PPG signals [109].
Figure 10. Example of skin temperature change due to an applied stimulus [127].
Figure 11. Facial electromyography [149]: location of electrodes.
Figure 12. Examples of EMG electrodes [148]: (a) needle electrode; (b) fine-wire electrode; (c) gelled electrodes; (d) dry electrodes.
Figure 13. Principle of electrooculography (EOG): (a) electrode placement scheme [160]; (b) measurement principle [161].
Figure 14. Comparison between EOG and EMG signals during three different, sequential actions [162]: 1—corrugator supercilii EMG; 2—vertical EOG; 3—horizontal EOG.
Figure 15. Classification of measurement methods for emotion recognition.
Abstract
1. Introduction
- (i) “emotion” is a response of the organism to a particular stimulus (a person, situation, or event); it is usually an intense, short-duration experience, and the person is typically well aware of it;
- (ii) “affect” is the result of the effect caused by an emotion and includes their dynamic interaction;
- (iii) “feeling” is always experienced in relation to a particular object of which the person is aware; its duration depends on how long the representation of the object remains active in the person’s mind;
- (iv) “mood” tends to be subtler, longer lasting, less intense, and more in the background, but it can shift a person’s affective state in a positive or negative direction.
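Russell’s circumplex model (Figure 1) arranges such affective states on two continuous axes, valence and arousal. As a minimal illustration of how discrete labels relate to those axes — the quadrant names and example emotions below are assumptions for the sketch, not taken from the article — a coarse mapping could look like:

```python
# Hypothetical sketch: placing affective states on Russell's circumplex
# using two coordinates, valence (unpleasant..pleasant) and arousal
# (calm..activated), both assumed to be scaled to [-1, 1].

def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point to a coarse circumplex quadrant."""
    if valence >= 0 and arousal >= 0:
        return "high-arousal positive (e.g. excited)"
    if valence < 0 and arousal >= 0:
        return "high-arousal negative (e.g. angry)"
    if valence < 0:
        return "low-arousal negative (e.g. sad)"
    return "low-arousal positive (e.g. relaxed)"
```

Many of the recognition methods reviewed below predict exactly such valence/arousal coordinates rather than discrete emotion labels.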
2. Emotions Evaluation Methods
2.1. Electroencephalography (EEG)
2.2. Electrocardiography (ECG)
2.3. Galvanic Skin Response (GSR)
- (i) it requires fewer measuring electrodes, which eases the use of wearable devices and allows emotional states to be assessed while a person engages in normal activities;
- (ii) GSR produces less raw data, especially during long-term monitoring, so the acquired data can be analysed more quickly and without much computational power;
- (iii) the equipment required for GSR measurements is much simpler and cheaper; if suitable electrodes are available, a measuring device can be assembled from popular, freely available components (ADC converters, microcontrollers, etc.).
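Because GSR is a single low-rate channel, even a simple split into the tonic and phasic components shown in Figure 7 is computationally cheap. The sketch below is an illustrative assumption — a moving-average baseline standing in for the dedicated decomposition methods used in the reviewed studies:

```python
# Illustrative sketch (assumed, not a method from the article):
# separate a raw GSR trace into a slowly varying tonic component
# (moving-average baseline) and a fast phasic component (residual).
# The 4 s window length is an arbitrary choice for the example.

def decompose_gsr(signal, fs, window_s=4.0):
    """Return (tonic, phasic): tonic is a centered moving average of the
    raw signal; phasic is the residual raw - tonic."""
    n = max(1, int(window_s * fs))
    tonic = []
    for i in range(len(signal)):
        lo, hi = max(0, i - n // 2), min(len(signal), i + n // 2 + 1)
        tonic.append(sum(signal[lo:hi]) / (hi - lo))
    phasic = [s - t for s, t in zip(signal, tonic)]
    return tonic, phasic
```

A stimulus-locked skin conductance response would then appear as a transient bump in the phasic component shortly after the trigger.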
2.4. Heart Rate Variability (HRV)
- (i) the static part of the signal depends on the structure of the tissue and on the average volume of arterial and venous blood, and it varies very slowly with respiration;
- (ii) the dynamic part represents the changes in blood volume that occur between the systolic and diastolic phases of the cardiac cycle [104].
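The static/dynamic split above maps directly onto the DC and AC parts of a PPG trace, and heart rate can be read from the AC part alone. The following is a minimal sketch under simplifying assumptions (a clean synthetic pulse wave, mean-crossing beat counting), not a robust implementation:

```python
# Illustrative sketch (assumed): split a PPG trace into its static (DC)
# and dynamic (AC) parts, then estimate heart rate from the AC part.
import math

def ppg_heart_rate(signal, fs):
    """Estimate heart rate (bpm) from a PPG trace sampled at fs Hz."""
    dc = sum(signal) / len(signal)      # static part: mean blood volume
    ac = [s - dc for s in signal]       # dynamic part: pulsatile component
    # count upward zero crossings of the AC part, one per beat
    beats = sum(1 for a, b in zip(ac, ac[1:]) if a < 0 <= b)
    duration_s = len(signal) / fs
    return 60.0 * beats / duration_s

# synthetic pulse wave: 1.2 Hz (72 bpm) riding on a DC level, 10 s at 100 Hz
fs = 100.0
sig = [10 + math.sin(2 * math.pi * 1.2 * i / fs + 0.1) for i in range(1000)]
```

Real PPG is noisier than this sine wave, which is why the reviewed systems rely on filtering and proper peak detection before deriving HRV features.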
2.5. Respiration Rate Analysis (RR)
- manual or semi-automatic breath-rate evaluation using simple timers or specialized software applications;
- methods based on measurements of humidity fluctuations in exhaled air;
- methods based on measurements of temperature fluctuations in exhaled air;
- methods based on measurements of air pressure variations due to respiration;
- methods based on measurements of variations in carbon dioxide concentration;
- methods based on measurements of variations in oxygen concentration;
- methods based on measurements of body movements;
- methods based on measurements of respiratory sounds.
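Whatever sensing principle is chosen from the list above, the resulting waveform is usually reduced to a breaths-per-minute figure. A minimal sketch of that last step, under the assumption of a clean oscillatory trace (real signals would need filtering first):

```python
# Illustrative sketch (assumed, not a method from the article):
# estimate breaths per minute from any respiration-related waveform
# (chest movement, exhaled-air temperature, pressure, etc.) by
# counting local maxima that rise above the signal mean.
import math

def breaths_per_minute(signal, fs):
    """Count above-mean local maxima as breaths; return breaths/min."""
    mean = sum(signal) / len(signal)
    peaks = sum(
        1
        for i in range(1, len(signal) - 1)
        if signal[i - 1] < signal[i] >= signal[i + 1] and signal[i] > mean
    )
    return 60.0 * peaks * fs / len(signal)

# synthetic chest-movement trace: 0.25 Hz (15 breaths/min), 60 s at 10 Hz
fs = 10.0
trace = [math.sin(2 * math.pi * 0.25 * i / fs) for i in range(600)]
```

The above-mean threshold suppresses small ripples near the baseline; noisier recordings would additionally require band-pass filtering around typical breathing frequencies.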
2.6. Skin Temperature Measurements (SKT)
2.7. Electromyogram (EMG)
2.8. Electrooculography (EOG)
2.9. Facial Expressions (FE), Body Posture (BP), and Gesture Analysis (GA)
- (i) it recognizes only strong emotions that last for some time; responses to weak emotions or to very short, low-intensity stimuli do not produce noticeable facial movements or changes in body posture;
- (ii) there is a possibility that changes in human motion or facial expressions are caused by environmental effects.
- (i) a huge amount of data is created when tracking many reference points;
- (ii) in body-posture tracking, it is difficult to determine the exact position of a reference point covered by clothes; in this case, special markers for vision systems have to be used.
3. Signal Analysis and Features Extraction Methods
4. Discussion
5. Conclusions and Future Trends
Author Contributions
Funding
Conflicts of Interest
References
- Rattanyu, K.; Ohkura, M.; Mizukawa, M. Emotion Monitoring from Physiological Signals for Service Robots in the Living Space. In Proceedings of the ICCAS 2010, Gyeonggi-do, Korea, 27–30 October 2010; pp. 580–583. [Google Scholar]
- Byron, K.; Terranova, S.; Nowicki, S. Nonverbal Emotion Recognition and Salespersons: Linking Ability to Perceived and Actual Success. J. Appl. Soc. Psychol. 2007, 37, 2600–2619. [Google Scholar] [CrossRef]
- Feidakis, M.; Daradoumis, T.; Caballe, S. Emotion Measurement in Intelligent Tutoring Systems: What, When and How to Measure. In Proceedings of the 2011 Third International Conference on Intelligent Networking and Collaborative Systems, IEEE, Fukuoka, Japan, 30 November–2 December 2011; pp. 807–812. [Google Scholar]
- Mandryk, R.L.; Atkins, M.S.; Inkpen, K.M. A continuous and objective evaluation of emotional experience with interactive play environments. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI’06), Montréal, QC, Canada, 22–27 April 2006; ACM Press: New York, NY, USA, 2006; p. 1027. [Google Scholar]
- Sosnowski, S.; Bittermann, A.; Kuhnlenz, K.; Buss, M. Design and Evaluation of Emotion-Display EDDIE. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 3113–3118. [Google Scholar]
- Ogata, T.; Sugano, S. Emotional communication between humans and the autonomous robot which has the emotion model. In Proceedings of the 1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Detroit, MI, USA, 10–15 May 1999; Volume 4, pp. 3177–3182. [Google Scholar]
- Malfaz, M.; Salichs, M.A. A new architecture for autonomous robots based on emotions. IFAC 2004, 37, 805–809. [Google Scholar] [CrossRef]
- Delkhoon, M.A.; Lotfizadeh, F. An Investigation on the Effect of Gender on Emotional Responses and Purchasing Intention Due to Advertisements. UCT J. Soc. Sci. Humanit. Res. 2014, 2, 6–11. [Google Scholar]
- Singh, J.; Goyal, G.; Gill, R. Use of neurometrics to choose optimal advertisement method for omnichannel business. Enterp. Inf. Syst. 2019, 1–23. [Google Scholar] [CrossRef]
- Chung, W.J.; Patwa, P.; Markov, M.M. Targeting Advertisements Based on Emotion. U.S. Patent Application No 12/958,775, 7 June 2012. [Google Scholar]
- D’Mello, S.K.; Craig, S.D.; Gholson, B.; Franklin, S.; Picard, R.W.; Graesser, A.C. Integrating Affect Sensors in an Intelligent Tutoring System. In Proceedings of the 2005 International Conference on Intelligent User Interfaces, San Diego, CA, USA, 10–13 January 2005. [Google Scholar]
- Woolf, B.P.; Arroyo, I.; Cooper, D.; Burleson, W.; Muldner, K. Affective Tutors: Automatic Detection of and Response to Student Emotion; Springer: Berlin/Heidelberg, Germany, 2010; pp. 207–227. [Google Scholar]
- Scotti, S.; Mauri, M.; Barbieri, R.; Jawad, B.; Cerutti, S.; Mainardi, L.; Brown, E.N.; Villamira, M.A. Automatic Quantitative Evaluation of Emotions in E-learning Applications. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; pp. 1359–1362. [Google Scholar]
- Kolakowska, A.; Landowska, A.; Szwoch, M.; Szwoch, W.; Wrobel, M.R. Emotion recognition and its application in software engineering. In Proceedings of the 2013 6th International Conference on Human System Interactions (HSI), Gdansk, Sopot, Poland, 6–8 June 2013; pp. 532–539. [Google Scholar]
- Guo, F.; Liu, W.L.; Cao, Y.; Liu, F.T.; Li, M.L. Optimization Design of a Webpage Based on Kansei Engineering. Hum. Factors Ergon. Manuf. Serv. Ind. 2016, 26, 110–126. [Google Scholar] [CrossRef]
- Yannakakis, G.N.; Hallam, J. Real-Time Game Adaptation for Optimizing Player Satisfaction. IEEE Trans. Comput. Intell. AI Games 2009, 1, 121–133. [Google Scholar] [CrossRef] [Green Version]
- Fleureau, J.; Guillotel, P.; Huynh-Thu, Q. Physiological-Based Affect Event Detector for Entertainment Video Applications. IEEE Trans. Affect. Comput. 2012, 3, 379–385. [Google Scholar] [CrossRef]
- Oatley, K.; Johnson-laird, P.N. Towards a Cognitive Theory of Emotions. Cognit. Emot. 1987, 1, 29–50. [Google Scholar] [CrossRef]
- Von Scheve, C.; Ismer, S. Towards a Theory of Collective Emotions. Emot. Rev. 2013, 5, 406–413. [Google Scholar] [CrossRef]
- Gray, J.A. On the classification of the emotions. Behav. Brain Sci. 1982, 5, 431–432. [Google Scholar] [CrossRef]
- Feidakis, M.; Daradoumis, T.; Caballe, S. Endowing e-Learning Systems with Emotion Awareness. In Proceedings of the 2011 Third International Conference on Intelligent Networking and Collaborative Systems, Fukuoka, Japan, 30 November–2 December 2011; pp. 68–75. [Google Scholar]
- Université de Montréal; Presses de l’Université de Montréal. Interaction of Emotion and Cognition in the Processing of Textual Material; Presses de l’Université de Montréal: Québec, QC, Canada, 1966; Volume 52. [Google Scholar]
- Russell, J.A. A circumplex model of affect. J. Pers. Soc. Psychol. 1980, 39, 1161–1178. [Google Scholar] [CrossRef]
- Csikszentmihalyi, M. Flow and the Foundations of Positive Psychology: The collected works of Mihaly Csikszentmihalyi; Springer: Dordrecht, The Netherlands, 2014; ISBN 9401790884. [Google Scholar]
- Kaklauskas, A. Biometric and Intelligent Decision Making Support; Springer: Cham, Switzerland, 2015; Volume 81, ISBN 978-3-319-13658-5. [Google Scholar]
- Kaklauskas, A.; Kuzminske, A.; Zavadskas, E.K.; Daniunas, A.; Kaklauskas, G.; Seniut, M.; Raistenskis, J.; Safonov, A.; Kliukas, R.; Juozapaitis, A.; et al. Affective tutoring system for built environment management. Comput. Educ. 2015, 82, 202–216. [Google Scholar] [CrossRef]
- Kaklauskas, A.; Jokubauskas, D.; Cerkauskas, J.; Dzemyda, G.; Ubarte, I.; Skirmantas, D.; Podviezko, A.; Simkute, I. Affective analytics of demonstration sites. Eng. Appl. Artif. Intell. 2019, 81, 346–372. [Google Scholar] [CrossRef]
- Kaklauskas, A.; Zavadskas, E.K.; Bardauskiene, D.; Cerkauskas, J.; Ubarte, I.; Seniut, M.; Dzemyda, G.; Kaklauskaite, M.; Vinogradova, I.; Velykorusova, A. An Affect-Based Built Environment Video Analytics. Autom. Constr. 2019, 106, 102888. [Google Scholar] [CrossRef]
- Emotion-Sensing Technology in the Internet of Things. Available online: https://onix-systems.com/blog/emotion-sensing-technology-in-the-internet-of-things (accessed on 30 December 2019).
- Wallbott, H.G.; Scherer, K.R. Assessing emotion by questionnaire. In The Measurement of Emotions; Academic Press: Cambridge, MA, USA, 1989; pp. 55–82. ISBN 9780125587044. [Google Scholar]
- Becker, A.; Hagenberg, N.; Roessner, V.; Woerner, W.; Rothenberger, A. Evaluation of the self-reported SDQ in a clinical setting: Do self-reports tell us more than ratings by adult informants? Eur. Child. Adolesc. Psychiatry 2004, 13, 17–24. [Google Scholar] [CrossRef]
- Isomursu, M.; Tähti, M.; Väinämö, S.; Kuutti, K. Experimental evaluation of five methods for collecting emotions in field settings with mobile applications. Int. J. Hum. Comput. Stud. 2007, 65, 404–418. [Google Scholar] [CrossRef]
- Mahlke, S.; Minge, M.; Thüring, M. Measuring multiple components of emotions in interactive contexts. In CHI ‘06 Extended Abstracts on Human Factors in Computing Systems-CHI EA ‘06; ACM Press: New York, NY, USA, 2006; p. 1061. [Google Scholar]
- Liapis, A.; Katsanos, C.; Sotiropoulos, D.; Xenos, M.; Karousos, N. Recognizing Emotions in Human Computer Interaction: Studying Stress Using Skin Conductance; Springer: Cham, Switzerland, 2015; pp. 255–262. [Google Scholar]
- Camurri, A.; Lagerlöf, I.; Volpe, G. Recognizing emotion from dance movement: Comparison of spectator recognition and automated techniques. Int. J. Hum. Comput. Stud. 2003, 59, 213–225. [Google Scholar] [CrossRef]
- Scherer, K.R. What are emotions? And how can they be measured? Soc. Sci. Inf. 2005, 44, 695–729. [Google Scholar] [CrossRef]
- Gonçalves, V.P.; Giancristofaro, G.T.; Filho, G.P.R.; Johnson, T.; Carvalho, V.; Pessin, G.; de Almeida Neris, V.P.; Ueyama, J. Assessing users’ emotion at interaction time: a multimodal approach with multiple sensors. Soft Comput. 2017, 21, 5309–5323. [Google Scholar] [CrossRef]
- St. Louis, E.K.; Frey, L.C.; Britton, J.W.; Frey, L.C.; Hopp, J.L.; Korb, P.; Koubeissi, M.Z.; Lievens, W.E.; Pestana-Knight, E.M.; St. Louis, E.K. Electroencephalography (EEG): An Introductory Text and Atlas of Normal and Abnormal Findings in Adults, Children, and Infants; American Epilepsy Society: Chicago, IL, USA, 2016; ISBN 9780997975604. [Google Scholar]
- Aminoff, M.J. Electroencephalography: General principles and clinical applications. In Aminoff’s Electrodiagnosis in Clinical Neurology; Saunders, W.B., Ed.; Elsevier B.V.: Amsterdam, The Netherlands, 2012; pp. 37–84. ISBN 9781455703081. [Google Scholar]
- Hope, C. “Volunteer Duty” Psychology Testing|Photo by Chris Hope AS.| Flickr. Available online: https://www.flickr.com/photos/tim_uk/8135755109/in/photostream/ (accessed on 27 December 2019).
- EEG: Electroencephalography—iMotions Software and EEG Headsets. Available online: https://imotions.com/biosensor/electroencephalography-eeg/ (accessed on 29 October 2019).
- Electroencephalography | Definition, Procedure, & Uses | Britannica.com. Available online: https://www.britannica.com/science/electroencephalography (accessed on 29 October 2019).
- Bajaj, V.; Pachori, R.B. EEG Signal Classification Using Empirical Mode Decomposition and Support Vector Machine. In Proceedings of the International Conference on Soft Computing for Problem Solving (SocProS 2011), 20–22 December 2011; Springer: New Delhi, India, 2012; pp. 623–635. [Google Scholar]
- Oikonomou, V.P.; Tzallas, A.T.; Fotiadis, D.I. A Kalman filter based methodology for EEG spike enhancement. Comput. Methods Programs Biomed. 2007, 85, 101–108. [Google Scholar] [CrossRef] [PubMed]
- Kaur, B.; Singh, D.; Roy, P.P. EEG Based Emotion Classification Mechanism in BCI. In Proceedings of the Procedia Computer Science, Sanur, Bali, Indonesia, 17–19 April 2018. [Google Scholar]
- Pagani, C. Violence and Complexity. Open Psychol. J. 2015, 8, 11–16. [Google Scholar] [CrossRef] [Green Version]
- Wan Ismail, W.O.A.S.; Hanif, M.; Mohamed, S.B.; Hamzah, N.; Rizman, Z.I. Human Emotion Detection via Brain Waves Study by Using Electroencephalogram (EEG). Int. J. Adv. Sci. Eng. Inf. Technol. 2016, 6, 1005. [Google Scholar] [CrossRef] [Green Version]
- Shakshi, R.J. Brain Wave Classification and Feature Extraction of EEG Signal by Using FFT on Lab View. Int. Res. J. Eng. Technol. 2016, 3, 1208–1212. [Google Scholar]
- EEG-Event Related Potentials. Available online: http://www.medicine.mcgill.ca/physio/vlab/biomed_signals/eeg_erp.htm (accessed on 3 November 2019).
- Vijayan, A.E.; Sen, D.; Sudheer, A.P. EEG-Based Emotion Recognition Using Statistical Measures and Auto-Regressive Modeling. In Proceedings of the 2015 IEEE International Conference on Computational Intelligence & Communication Technology, Riga, Latvia, 3–5 June 2015; pp. 587–591. [Google Scholar]
- Dissanayake, T.; Rajapaksha, Y.; Ragel, R.; Nawinne, I. An Ensemble Learning Approach for Electrocardiogram Sensor Based Human Emotion Recognition. Sensors 2019, 19, 4495. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Nakisa, B.; Rastgoo, M.N.; Tjondronegoro, D.; Chandran, V. Evolutionary computation algorithms for feature selection of EEG-based emotion recognition using mobile sensors. Expert Syst. Appl. 2018, 93, 143–155. [Google Scholar] [CrossRef] [Green Version]
- Liu, Y.-H.; Wu, C.-T.; Cheng, W.-T.; Hsiao, Y.-T.; Chen, P.-M.; Teng, J.-T. Emotion Recognition from Single-Trial EEG Based on Kernel Fisher’s Emotion Pattern and Imbalanced Quasiconformal Kernel Support Vector Machine. Sensors 2014, 14, 13361–13388. [Google Scholar] [CrossRef] [Green Version]
- Zhang, J.; Chen, M.; Zhao, S.; Hu, S.; Shi, Z.; Cao, Y. ReliefF-Based EEG Sensor Selection Methods for Emotion Recognition. Sensors 2016, 16, 1558. [Google Scholar] [CrossRef]
- Mehmood, R.; Lee, H. Towards Building a Computer Aided Education System for Special Students Using Wearable Sensor Technologies. Sensors 2017, 17, 317. [Google Scholar] [CrossRef]
- Purnamasari, P.; Ratna, A.; Kusumoputro, B. Development of Filtered Bispectrum for EEG Signal Feature Extraction in Automatic Emotion Recognition Using Artificial Neural Networks. Algorithms 2017, 10, 63. [Google Scholar] [CrossRef]
- Li, Y.; Huang, J.; Zhou, H.; Zhong, N. Human Emotion Recognition with Electroencephalographic Multidimensional Features by Hybrid Deep Neural Networks. Appl. Sci. 2017, 7, 1060. [Google Scholar] [CrossRef] [Green Version]
- Alazrai, R.; Homoud, R.; Alwanni, H.; Daoud, M. EEG-Based Emotion Recognition Using Quadratic Time-Frequency Distribution. Sensors 2018, 18, 2739. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Chao, H.; Dong, L.; Liu, Y.; Lu, B. Emotion Recognition from Multiband EEG Signals Using CapsNet. Sensors 2019, 19, 2212. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Cai, J.; Chen, W.; Yin, Z. Multiple Transferable Recursive Feature Elimination Technique for Emotion Recognition Based on EEG Signals. Symmetry 2019, 11, 683. [Google Scholar] [CrossRef] [Green Version]
- Gao, Z.; Cui, X.; Wan, W.; Gu, Z. Recognition of Emotional States using Multiscale Information Analysis of High Frequency EEG Oscillations. Entropy 2019, 21, 609. [Google Scholar] [CrossRef] [Green Version]
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.-S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef] [Green Version]
- Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A Multimodal Database for Affect Recognition and Implicit Tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55. [Google Scholar] [CrossRef] [Green Version]
- Carvalho, S.; Leite, J.; Galdo-Álvarez, S.; Gonçalves, Ó.F. The Emotional Movie Database (EMDB): A Self-Report and Psychophysiological Study. Appl. Psychophysiol. Biofeedback 2012, 37, 279–294. [Google Scholar] [CrossRef] [Green Version]
- Abadi, M.K.; Subramanian, R.; Kia, S.M.; Avesani, P.; Patras, I.; Sebe, N. DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses. IEEE Trans. Affect. Comput. 2015, 6, 209–222. [Google Scholar] [CrossRef]
- Schaekermann, M. Biosignal Datasets for Emotion Recognition. Available online: http://hcigames.com/hci/biosignal-datasets-emotion-recognition/ (accessed on 5 November 2019).
- International Neural Network Society; Verband der Elektrotechnik; Institute of Electrical and Electronics Engineers. ANNA ’18: Advances in Neural Networks and Applications 2018, 15–17 September 2018, St. Konstantin and Elena Resort, Bulgaria; Vde Verlag GmbH: Berlin, Germany, 2018; ISBN 9783800747566. [Google Scholar]
- Goshvarpour, A.; Abbasi, A.; Goshvarpour, A. An Emotion Recognition Approach Based on Wavelet Transform and Second-Order Difference Plot of ECG. J. AI Data Min. 2017, 5, 211–221. [Google Scholar]
- Al Khatib, I.; Bertozzi, D.; Poletti, F.; Benini, L.; Jantsch, A.; Bechara, M.; Khalifeh, H.; Hajjar, M.; Nabiev, R.; Jonsson, S. Hardware/software architecture for real-time ECG monitoring and analysis leveraging MPSoC technology. In Transactions on High-Performance Embedded Architectures and Compilers I; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2007; pp. 239–258. [Google Scholar]
- Paithane, A.N.; Bormane, D.S.; Dinde, S. Human Emotion Recognition using Electrocardiogram Signals. Int. J. Recent Innov. Trends Comput. Commun. 2014, 2, 194–197. [Google Scholar]
- Amri, M.F.; Rizqyawan, M.I.; Turnip, A. ECG signal processing using offline-wavelet transform method based on ECG-IoT device. In Proceedings of the 2016 3rd International Conference on Information Technology, Computer and Electrical Engineering, Semarang, Indonesia, 18–20 October 2016; pp. 25–30. [Google Scholar]
- ECG Setup—Wikimedia Commons. Available online: https://commons.wikimedia.org/wiki/File:Ekg_NIH.jpg (accessed on 28 December 2019).
- Cai, J.; Liu, G.; Hao, M. The Research on Emotion Recognition from ECG Signal. In Proceedings of the 2009 International Conference on Information Technology and Computer Science, Kiev, Ukraine, 25–26 July 2009; pp. 497–500. [Google Scholar]
- Uyarel, H.; Okmen, E.; Cobanoǧlu, N.; Karabulut, A.; Cam, N. Effects of anxiety on QT dispersion in healthy young men. Acta Cardiol. 2006, 61, 83–87. [Google Scholar] [CrossRef] [PubMed]
- Abdul Jamil, M.M.; Soon, C.F.; Achilleos, A.; Youseffi, M.; Javid, F. Electrocardiograph (ECG) circuit design and software-based processing using LabVIEW. J. Telecommun. Electron. Comput. Eng. 2017, 9, 57–66. [Google Scholar]
- Nikolova, D.; Petkova, P.; Manolova, A.; Georgieva, P. ECG-based Emotion Recognition: Overview of Methods and Applications. In Proceedings of the ANNA ’18 Advances in Neural Networks and Applications 2018, St. Konstantin and Elena Resort, Bulgaria, 15–17 September 2018; pp. 118–122. [Google Scholar]
- Marín-Morales, J.; Higuera-Trujillo, J.L.; Greco, A.; Guixeres, J.; Llinares, C.; Scilingo, E.P.; Alcañiz, M.; Valenza, G. Affective computing in virtual reality: emotion recognition from brain and heartbeat dynamics using wearable sensors. Sci. Rep. 2018, 8, 13657. [Google Scholar] [CrossRef] [PubMed]
- Udovičić, G.; Derek, J.; Russo, M.; Sikora, M. Wearable Emotion Recognition system based on GSR and PPG signals. In Proceedings of the 2nd International Workshop on Multimedia for Personal Health and Health Care, Mountain View, CA, USA, 23 October 2017; pp. 53–59. [Google Scholar]
- Wu, G.; Liu, G.; Hao, M. The Analysis of Emotion Recognition from GSR Based on PSO. In Proceedings of the 2010 International Symposium on Intelligence Information Processing and Trusted Computing, Wuhan, China, 28–29 October 2010; pp. 360–363. [Google Scholar]
- Lidberg, L.; Wallin, B.G. Sympathetic Skin Nerve Discharges in Relation to Amplitude of Skin Resistance Responses. Psychophysiology 1981, 18, 268–270. [Google Scholar] [CrossRef]
- Ayata, D.; Yaslan, Y.; Kamasak, M. Emotion recognition via galvanic skin response: Comparison of machine learning algorithms and feature extraction methods. Istanbul Univ. J. Electr. Electron. Eng. 2017, 17, 3129–3136. [Google Scholar]
- Critchley, H.D. Review: Electrodermal Responses: What Happens in the Brain. Neurosci 2002, 8, 132–142. [Google Scholar] [CrossRef]
- Lang, P.J.; Greenwald, M.K.; Bradley, M.M.; Hamm, A.O. Looking at pictures: Affective, facial, visceral, and behavioral reactions. Psychophysiology 1993, 30, 261–273. [Google Scholar] [CrossRef]
- Duda, S.; Hawkins, D.; McGill, M. Physiological Response Measurements. Eye Track. User Exp. Des. 2014, 81–108. [Google Scholar] [CrossRef]
- Boucsein, W.; Fowles, D.C.; Grimnes, S.; Ben-Shakhar, G.; Roth, W.T.; Dawson, M.E.; Filion, D.L.; Society for Psychophysiological Research Ad Hoc Committee on Electrodermal Measures. Publication recommendations for electrodermal measurements. Psychophysiology 2012, 49, 1017–1034. [Google Scholar]
- Van Dooren, M.; de Vries, J.J.; Janssen, J.H. Emotional sweating across the body: Comparing 16 different skin conductance measurement locations. Physiol. Behav. 2012, 106, 298–304. [Google Scholar] [CrossRef] [PubMed]
- Neuro-Tools: GSR|Acuity Eyetracking Blog. Available online: https://acuityets.wordpress.com/2016/10/24/series-neuro-tools-gsr/ (accessed on 8 November 2019).
- Gatti, E.; Calzolari, E.; Maggioni, E.; Obrist, M. Emotional ratings and skin conductance response to visual, auditory and haptic stimuli. Sci. Data 2018, 5, 180120. [Google Scholar] [CrossRef] [PubMed]
- Greco, A.; Lanata, A.; Citi, L.; Vanello, N.; Valenza, G.; Scilingo, E. Skin Admittance Measurement for Emotion Recognition: A Study over Frequency Sweep. Electronics 2016, 5, 46. [Google Scholar] [CrossRef] [Green Version]
- Villon, O.; Lisetti, C. Toward Recognizing Individual’s Subjective Emotion from Physiological Signals in Practical Application. In Proceedings of the Twentieth IEEE International Symposium on Computer-Based Medical Systems (CBMS’07), Maribor, Slovenia, 20–22 June 2007; pp. 357–362. [Google Scholar]
- Chanel, G.; Kierkels, J.J.M.; Soleymani, M.; Pun, T. Short-term emotion assessment in a recall paradigm. Int. J. Hum. Comput. Stud. 2009, 67, 607–627. [Google Scholar] [CrossRef]
- Chanel, G.; Kronegg, J.; Grandjean, D.; Pun, T. Emotion Assessment: Arousal Evaluation Using EEG’s and Peripheral Physiological Signals. In Proceedings of the International Workshop on Multimedia Content Representation, Classification and Security, Istanbul, Turkey, 11–13 September 2006; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4105, pp. 530–537. [Google Scholar]
- Peter, C.; Ebert, E.; Beikirch, H. A Wearable Multi-sensor System for Mobile Acquisition of Emotion-Related Physiological Data; Springer: Berlin/Heidelberg, Germany, 2005; pp. 691–698. [Google Scholar]
- Villon, O.; Lisetti, C. A User-Modeling Approach to Build User’s Psycho-Physiological Maps of Emotions using Bio-Sensors. In Proceedings of the ROMAN 2006–The 15th IEEE International Symposium on Robot and Human Interactive Communication, Herthfordshire, UK, 6–8 September 2006; pp. 269–276. [Google Scholar]
- Sungwon, L.; Choong-Seon, H.; Yong Kwi, L.; Hyun-soon, S. Experimental emotion recognition system and services for mobile network environments. In Proceedings of the 2010 IEEE Sensors, Limerick, Ireland, 1–4 November 2010; pp. 136–140. [Google Scholar]
- Sierra, A.D.S.; Ávila, C.S.; Casanova, J.G.; Bailador, G. Real-Time Stress Detection by Means of Physiological Signals. In Advanced Biometric Technologies; IntechOpen: London, UK, 2011; pp. 23–44. [Google Scholar]
- Hsieh, P.-Y.; Chin, C.-L. The emotion recognition system with Heart Rate Variability and facial image features. In Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), San Diego, CA, USA, 8–12 March 2011; pp. 1933–1940. [Google Scholar]
- Huang, C.; Liew, S.S.; Lin, G.R.; Poulsen, A.; Ang, M.J.Y.; Chia, B.C.S.; Chew, S.Y.; Kwek, Z.P.; Wee, J.L.K.; Ong, E.H.; et al. Discovery of Irreversible Inhibitors Targeting Histone Methyltransferase, SMYD3. ACS Med. Chem. Lett. 2019, 10, 978–984. [Google Scholar] [CrossRef]
- Benezeth, Y.; Li, P.; Macwan, R.; Nakamura, K.; Yang, F.; Benezeth, Y.; Li, P.; Macwan, R.; Nakamura, K.; Gomez, R.; et al. Remote Heart Rate Variability for Emotional State Monitoring. In Proceedings of the 2018 IEEE EMBS International Conference on Biomedical & Health Informatics (BHI), Las Vegas, NV, USA, 4–7 March 2018; pp. 153–156. [Google Scholar]
- Andreas, H.; Silke, G.; Peter, S.J.W. Emotion Recognition Using Bio-Sensors: First Steps Towards an Automatic System. In Affective Dialogue Systems; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2004. [Google Scholar]
- Mikuckas, A.; Mikuckiene, I.; Venckauskas, A.; Kazanavicius, E.; Lukas, R.; Plauska, I. Emotion recognition in human computer interaction systems. Elektron. Elektrotech. 2014, 20, 51–56. [Google Scholar] [CrossRef]
- Zhu, J.; Ji, L.; Liu, C. Heart rate variability monitoring for emotion and disorders of emotion. Physiol. Meas. 2019, 40, 064004. [Google Scholar] [CrossRef]
- Markovics, Z.; Lauznis, J.; Erins, M.; Minejeva, O.; Kivlenieks, R. Testing and Analysis of the HRV Signals from Wearable Smart HRV Sensors. Int. J. Eng. Technol. 2018, 7, 1211. [Google Scholar] [CrossRef]
- Tamura, T.; Maeda, Y.; Sekine, M.; Yoshida, M. Wearable Photoplethysmographic Sensors—Past and Present. Electronics 2014, 3, 282–302. [Google Scholar] [CrossRef]
- Allen, J. Photoplethysmography and its application in clinical physiological measurement. Physiol. Meas. 2007, 28, R1–R39. [Google Scholar] [CrossRef] [Green Version]
- Jeyhani, V.; Mahdiani, S.; Peltokangas, M.; Vehkaoja, A. Comparison of HRV parameters derived from photoplethysmography and electrocardiography signals. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Milano, Italy, 25–29 August 2015; pp. 5952–5955. [Google Scholar]
- Choi, K.-H.; Kim, J.; Kwon, O.S.; Kim, M.J.; Ryu, Y.H.; Park, J.-E. Is heart rate variability (HRV) an adequate tool for evaluating human emotions?–A focus on the use of the International Affective Picture System (IAPS). Psychiatry Res. 2017, 251, 192–196. [Google Scholar] [CrossRef] [PubMed]
- Maritsch, M.; Bérubé, C.; Kraus, M.; Lehmann, V.; Züger, T.; Feuerriegel, S.; Kowatsch, T.; Wortmann, F. Improving Heart Rate Variability Measurements from consumer Smartwatches with Machine Learning. In Proceedings of the UbiComp ’19: The 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing, London, UK, 9–13 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 934–938. [Google Scholar]
- Elgendi, M.; Fletcher, R.; Liang, Y.; Howard, N.; Lovell, N.H.; Abbott, D.; Lim, K.; Ward, R. The use of photoplethysmography for assessing hypertension. NPJ Digit. Med. 2019, 2, 60. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Xiefeng, C.; Wang, Y.; Dai, S.; Zhao, P.; Liu, Q. Heart sound signals can be used for emotion recognition. Sci. Rep. 2019, 9, 6486. [Google Scholar] [CrossRef] [PubMed]
- Boric-Lubecke, O.; Massagram, W.; Lubecke, V.M.; Host-Madsen, A.; Jokanovic, B. Heart Rate Variability Assessment Using Doppler Radar with Linear Demodulation. In Proceedings of the 2008 38th European Microwave Conference, Amsterdam, The Netherlands, 28–30 October 2008; pp. 420–423. [Google Scholar]
- Chanel, G.; Ansari-Asl, K.; Pun, T. Valence-arousal evaluation using physiological signals in an emotion recall paradigm. In Proceedings of the 2007 IEEE International Conference on Systems, Man and Cybernetics, Quebec, QC, Canada, 7–10 October 2007; pp. 2662–2667. [Google Scholar]
- Park, M.W.; Kim, C.J.; Hwang, M.; Lee, E.C. Individual Emotion Classification between Happiness and Sadness by Analyzing Photoplethysmography and Skin Temperature. In Proceedings of the 2013 Fourth World Congress on Software Engineering, Hong Kong, China, 3–4 December 2013; pp. 190–194. [Google Scholar]
- Quazi, M.T.; Mukhopadhyay, S.C.; Suryadevara, N.K.; Huang, Y.M. Towards the smart sensors based human emotion recognition. In Proceedings of the 2012 IEEE International Instrumentation and Measurement Technology Conference Proceedings, Graz, Austria, 13–16 May 2012; pp. 2365–2370. [Google Scholar]
- Choi, J.; Ahmed, B.; Gutierrez-Osuna, R. Development and Evaluation of an Ambulatory Stress Monitor Based on Wearable Sensors. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 279–286. [Google Scholar] [CrossRef] [Green Version]
- Lee, M.S.; Lee, Y.K.; Pae, D.S.; Lim, M.T.; Kim, D.W.; Kang, T.K. Fast Emotion Recognition Based on Single Pulse PPG Signal with Convolutional Neural Network. Appl. Sci. 2019, 9, 3355. [Google Scholar] [CrossRef] [Green Version]
- Hui, T.K.L.; Sherratt, R.S. Coverage of Emotion Recognition for Common Wearable Biosensors. Biosensors 2018, 8, 30. [Google Scholar]
- Zhang, Q.; Chen, X.; Zhan, Q.; Yang, T.; Xia, S. Respiration-based emotion recognition with deep learning. Comput. Ind. 2017, 92–93, 84–90. [Google Scholar] [CrossRef]
- Ginsburg, A.S.; Lenahan, J.L.; Izadnegahdar, R.; Ansermino, J.M. A Systematic Review of Tools to Measure Respiratory Rate in Order to Identify Childhood Pneumonia. Am. J. Respir. Crit. Care Med. 2018, 197, 1116–1127. [Google Scholar] [CrossRef]
- Liu, H.; Allen, J.; Zheng, D.; Chen, F. Recent development of respiratory rate measurement technologies. Physiol. Meas. 2019, 40, 07TR01. [Google Scholar] [CrossRef] [Green Version]
- Takahashi, K.; Namikawa, S.; Hashimoto, M. Computational emotion recognition using multimodal physiological signals: Elicited using Japanese kanji words. In Proceedings of the 2012 35th International Conference on Telecommunications and Signal Processing (TSP), Prague, Czech Republic, 3–4 July 2012; pp. 615–620. [Google Scholar]
- Katsis, C.D.; Katertsidis, N.; Ganiatsas, G.; Fotiadis, D.I. Toward Emotion Recognition in Car-Racing Drivers: A Biosignal Processing Approach. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2008, 38, 502–512. [Google Scholar] [CrossRef]
- Nhan, B.R.; Chau, T. Classifying Affective States Using Thermal Infrared Imaging of the Human Face. IEEE Trans. Biomed. Eng. 2010, 57, 979–987. [Google Scholar] [CrossRef] [PubMed]
- Landowska, A. Emotion Monitoring—Verification of Physiological Characteristics Measurement Procedures. Metrol. Meas. Syst. 2014, 21, 719–732. [Google Scholar] [CrossRef] [Green Version]
- Valderas, M.T.; Bolea, J.; Laguna, P.; Vallverdu, M.; Bailon, R. Human emotion recognition using heart rate variability analysis with spectral bands based on respiration. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milano, Italy, 25–29 August 2015; pp. 6134–6137. [Google Scholar]
- Healey, J.A.; Picard, R.W. Detecting stress during real-world driving tasks using physiological sensors. IEEE Trans. Intell. Transp. Syst. 2005, 6, 156–166. [Google Scholar] [CrossRef] [Green Version]
- Kosonogov, V.; De Zorzi, L.; Honoré, J.; Martínez-Velázquez, E.S.; Nandrino, J.-L.; Martinez-Selva, J.M.; Sequeira, H. Facial thermal variations: A new marker of emotional arousal. PLoS ONE 2017, 12, e0183592. [Google Scholar] [CrossRef] [PubMed]
- Krumova, E.K.; Frettlöh, J.; Klauenberg, S.; Richter, H.; Wasner, G.; Maier, C. Long-term skin temperature measurements – A practical diagnostic tool in complex regional pain syndrome. Pain 2008, 140, 8–22. [Google Scholar] [CrossRef]
- American Psychosomatic Society; National Research Council (U.S.); Committee on Problems of Neurotic Behavior; American Society for Research in Psychosomatic Problems. Psychosomatic Medicine; Elsevier: Amsterdam, The Netherlands, 1943; Volume 5. [Google Scholar]
- Vos, P.; De Cock, P.; Munde, V.; Petry, K.; Van Den Noortgate, W.; Maes, B. The tell-tale: What do heart rate, skin temperature and skin conductance reveal about emotions of people with severe and profound intellectual disabilities? Res. Dev. Disabil. 2012, 33, 1117–1127. [Google Scholar] [CrossRef] [PubMed]
- Okada, S.; Hori, N.; Kimoto, K.; Onozuka, M.; Sato, S.; Sasaguri, K. Effects of biting on elevation of blood pressure and other physiological responses to stress in rats: Biting may reduce allostatic load. Brain Res. 2007, 1185, 189–194. [Google Scholar] [CrossRef] [PubMed]
- Briese, E. Cold increases and warmth diminishes stress-induced rise of colonic temperature in rats. Physiol. Behav. 1992, 51, 881–883. [Google Scholar] [CrossRef]
- Kim, K.H.; Bang, S.W.; Kim, S.R. Emotion recognition system using short-term monitoring of physiological signals. Med. Biol. Eng. Comput. 2004, 42, 419–427. [Google Scholar] [CrossRef]
- Leijdekkers, P.; Gay, V.; Wong, F. CaptureMyEmotion: A mobile app to improve emotion learning for autistic children using sensors. In Proceedings of the 26th IEEE International Symposium on Computer-Based Medical Systems, Porto, Portugal, 20–22 June 2013; pp. 381–384.
- Choi, J.-S.; Bang, J.; Heo, H.; Park, K. Evaluation of Fear Using Nonintrusive Measurement of Multimodal Sensors. Sensors 2015, 15, 17507–17533. [Google Scholar] [CrossRef] [Green Version]
- Nakanishi, R.; Imai-Matsumura, K. Facial skin temperature decreases in infants with joyful expression. Infant Behav. Dev. 2008, 31, 137–144. [Google Scholar] [CrossRef]
- Bruno, P.; Melnyk, V.; Völckner, F. Temperature and emotions: Effects of physical temperature on responses to emotional advertising. Int. J. Res. Mark. 2017, 34, 302–320. [Google Scholar] [CrossRef]
- Sonkusare, S.; Ahmedt-Aristizabal, D.; Aburn, M.J.; Nguyen, V.T.; Pang, T.; Frydman, S.; Denman, S.; Fookes, C.; Breakspear, M.; Guo, C.C. Detecting changes in facial temperature induced by a sudden auditory stimulus based on deep learning-assisted face tracking. Sci. Rep. 2019, 9, 4729. [Google Scholar] [CrossRef] [PubMed]
- Van Marken Lichtenbelt, W.D.; Daanen, H.A.M.; Wouters, L.; Fronczek, R.; Raymann, R.J.E.M.; Severens, N.M.W.; Van Someren, E.J.W. Evaluation of wireless determination of skin temperature using iButtons. Physiol. Behav. 2006, 88, 489–497. [Google Scholar] [CrossRef] [PubMed]
- Nasoz, F.; Alvarez, K.; Lisetti, C.L.; Finkelstein, N. Emotion recognition from physiological signals using wireless sensors for presence technologies. Cognit. Technol. Work 2004, 6, 4–14. [Google Scholar] [CrossRef]
- Puri, C.; Olson, L.; Pavlidis, I.; Levine, J.; Starren, J. Stresscam: Non-contact measurement of users’ emotional states through thermal imaging. In Proceedings of the Conference on Human Factors in Computing Systems (CHI EA 2005), Portland, OR, USA, 2–7 April 2005; pp. 1725–1728. [Google Scholar]
- Zong, C.; Chetouani, M. Hilbert-Huang transform based physiological signals analysis for emotion recognition. In Proceedings of the 2009 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Ajman, UAE, 14–17 December 2009; pp. 334–339. [Google Scholar]
- Ekman, P.; Friesen, W.V. A technique for the measurement of facial action. In Facial Action Coding System (FACS); Paul Ekman Group: Manchester, UK, 1978. [Google Scholar]
- Matzke, B.; Herpertz, S.C.; Berger, C.; Fleischer, M.; Domes, G. Facial Reactions during Emotion Recognition in Borderline Personality Disorder: A Facial Electromyography Study. Psychopathology 2014, 47, 101–110. [Google Scholar] [CrossRef] [PubMed]
- Turabzadeh, S.; Meng, H.; Swash, R.; Pleva, M.; Juhar, J. Facial Expression Emotion Detection for Real-Time Embedded Systems. Technologies 2018, 6, 17. [Google Scholar] [CrossRef] [Green Version]
- Huang, Y.; Chen, F.; Lv, S.; Wang, X. Facial Expression Recognition: A Survey. Symmetry 2019, 11, 1189. [Google Scholar] [CrossRef] [Green Version]
- Weyers, P.; Muhlberger, A.; Hefele, C.; Pauli, P. Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology 2006, 43, 450–453. [Google Scholar] [CrossRef]
- Zahak, M. Signal Acquisition Using Surface EMG and Circuit Design Considerations for Robotic Prosthesis. In Computational Intelligence in Electromyography Analysis–A Perspective on Current Applications and Future Challenges; InTech: London, UK, 2012. [Google Scholar]
- Van Boxtel, A. Facial EMG as a tool for inferring affective states. Proc. Meas. Behav. 2010, 2010, 104–108. [Google Scholar]
- EMG Electrodes—Supplies. Available online: https://bio-medical.com/supplies/emg-electrodes.html?p=2 (accessed on 7 November 2019).
- Wioleta, S. Using physiological signals for emotion recognition. In Proceedings of the 2013 6th International Conference on Human System Interactions (HSI), Gdansk, Poland, 6–8 June 2013; pp. 556–561. [Google Scholar]
- Girardi, D.; Lanubile, F.; Novielli, N. Emotion detection using noninvasive low cost sensors. In Proceedings of the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017; pp. 125–130. [Google Scholar]
- Martínez-Rodrigo, A.; Zangróniz, R.; Pastor, J.M.; Latorre, J.M.; Fernández-Caballero, A. Emotion Detection in Ageing Adults from Physiological Sensors. In Proceedings of the Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2015; Volume 376, pp. 253–261. [Google Scholar]
- Nakasone, A.; Prendinger, H.; Ishizuka, M. ProComp Infiniti Bio-signal Encoder. In Proceedings of the 5th International Workshop on Biosignal Interpretation, Tokyo, Japan, 6–8 September 2005; pp. 219–222. [Google Scholar]
- Wagner, J.; Kim, J.; Andre, E. From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification. In Proceedings of the 2005 IEEE International Conference on Multimedia and Expo, Amsterdam, The Netherlands, 6–8 July 2005; pp. 940–943. [Google Scholar]
- Furman, J.M.; Wuyts, F.L. Vestibular Laboratory Testing. In Aminoff’s Electrodiagnosis in Clinical Neurology; Saunders: Philadelphia, PA, USA, 2012; pp. 699–723. [Google Scholar]
- Lord, M.P.; Wright, W.D. The investigation of eye movements. Rep. Prog. Phys. 1950, 13, 1–23. [Google Scholar]
- Aguiñaga, A.R.; Lopez Ramirez, M.; Alanis Garza, A.; Baltazar, R.; Zamudio, V.M. Emotion analysis through physiological measurements. In Workshop Proceedings of the 9th International Conference on Intelligent Environments; IOS Press: Amsterdam, The Netherlands, 2013; pp. 97–106. [Google Scholar]
- Picot, A.; Charbonnier, S.; Caplier, A. EOG-based drowsiness detection: Comparison between a fuzzy system and two supervised learning classifiers. IFAC Proc. Vol. 2011, 44, 14283–14288. [Google Scholar] [CrossRef] [Green Version]
- Ramkumar, S.; Sathesh Kumar, K.; Dhiliphan Rajkumar, T.; Ilayaraja, M.; Shankar, K. A review-classification of electrooculogram based human computer interfaces. Biomed. Res. 2018, 29, 1078–1084. [Google Scholar]
- Siddiqui, U.; Shaikh, A.N. An Overview of “Electrooculography”. Int. J. Adv. Res. Comput. Commun. Eng. 2013, 2, 4238–4330. [Google Scholar]
- Perdiz, J.; Pires, G.; Nunes, U.J. Emotional state detection based on EMG and EOG biosignals: A short survey. In Proceedings of the 2017 IEEE 5th Portuguese Meeting on Bioengineering (ENBENG), Coimbra, Portugal, 16–18 February 2017; pp. 1–4. [Google Scholar]
- Cruz, A.; Garcia, D.; Pires, G.; Nunes, U. Facial Expression Recognition based on EOG toward Emotion Detection for Human-Robot Interaction. In Proceedings of the International Conference on Bio-inspired Systems and Signal Processing, SCITEPRESS—Science and Technology Publications, Lisbon, Portugal, 12–15 January 2015; pp. 31–37. [Google Scholar]
- Chai, X.; Wang, Q.; Zhao, Y.; Li, Y.; Liu, D.; Liu, X.; Bai, O. A Fast, Efficient Domain Adaptation Technique for Cross-Domain Electroencephalography (EEG)-Based Emotion Recognition. Sensors 2017, 17, 1014. [Google Scholar] [CrossRef] [Green Version]
- Wang, Y.; Lv, Z.; Zheng, Y. Automatic Emotion Perception Using Eye Movement Information for E-Healthcare Systems. Sensors 2018, 18, 2826. [Google Scholar] [CrossRef] [Green Version]
- Paul, S.; Banerjee, A.; Tibarewala, D.N. Emotional eye movement analysis using electrooculography signal. Int. J. Biomed. Eng. Technol. 2017, 23, 59. [Google Scholar] [CrossRef]
- Soundariya, R.S.; Renuga, R. Eye movement based emotion recognition using electrooculography. In Proceedings of the 2017 Innovations in Power and Advanced Computing Technologies (i-PACT), Vellore, India, 21–22 April 2017; pp. 1–5. [Google Scholar]
- Bulling, A.; Ward, J.A.; Gellersen, H.; Tröster, G. Eye movement analysis for activity recognition using electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 741–753. [Google Scholar] [CrossRef]
- Saneiro, M.; Santos, O.C.; Salmeron-Majadas, S.; Boticario, J.G. Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches. Sci. World J. 2014, 2014, 1–14. [Google Scholar] [CrossRef]
- Li, Y. Hand gesture recognition using Kinect. In Proceedings of the 2012 IEEE International Conference on Computer Science and Automation Engineering, Zhangjiajie, China, 25–27 May 2012; pp. 196–199. [Google Scholar]
- Schindler, K.; Van Gool, L.; de Gelder, B. Recognizing emotions expressed by body pose: A biologically inspired neural model. Neural Netw. 2008, 21, 1238–1246. [Google Scholar] [CrossRef]
- Farnsworth, B. Facial Action Coding System (FACS)—A Visual Guidebook. Available online: https://imotions.com/blog/facial-action-coding-system/ (accessed on 9 November 2019).
- Shan, C.; Gong, S.; McOwan, P.W. Beyond facial expressions: Learning human emotion from body gestures. 2007. Available online: https://www.dcs.warwick.ac.uk/bmvc2007/proceedings/CD-ROM/papers/276/bmvc07_v2.pdf (accessed on 9 November 2019).
- Gavrilescu, M. Recognizing emotions from videos by studying facial expressions, body postures and hand gestures. In Proceedings of the 2015 23rd Telecommunications Forum Telfor (TELFOR), Belgrade, Serbia, 24–26 November 2015; pp. 720–723. [Google Scholar]
- Metri, P.; Ghorpade, J.; Butalia, A. Facial Emotion Recognition Using Context Based Multimodal Approach. Int. J. Interact. Multimed. Artif. Intell. 2011, 1, 12. [Google Scholar] [CrossRef]
- Lee, S.; Bae, M.; Lee, W.; Kim, H. CEPP: Perceiving the Emotional State of the User Based on Body Posture. Appl. Sci. 2017, 7, 978. [Google Scholar] [CrossRef] [Green Version]
- Van den Stock, J.; Righart, R.; de Gelder, B. Body Expressions Influence Recognition of Emotions in the Face and Voice. Emotion 2007, 7, 487–494. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Castellano, G.; Kessous, L.; Caridakis, G. Emotion Recognition through Multiple Modalities: Face, Body Gesture, Speech. In Affect and Emotion in Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2008; pp. 92–103. [Google Scholar]
- Subramanian, R.; Wache, J.; Abadi, M.K.; Vieriu, R.L.; Winkler, S.; Sebe, N. ASCERTAIN: Emotion and Personality Recognition Using Commercial Sensors. IEEE Trans. Affect. Comput. 2018, 9, 147–160. [Google Scholar] [CrossRef]
- Gay, V.; Leijdekkers, P.; Wong, F. Using sensors and facial expression recognition to personalize emotion learning for autistic children. Stud. Health Technol. Inform. 2013, 189, 71–76. [Google Scholar]
- Ganzha, M.; Maciaszek, L.; Paprzycki, M. (Eds.) Proceedings of the 2018 Federated Conference on Computer Science and Information Systems, Poznań, Poland, 9–12 September 2018; Polskie Towarzystwo Informatyczne: Warszawa, Poland; Institute of Electrical and Electronics Engineers: New York, NY, USA, 2018; ISBN 9788360810903. [Google Scholar]
- Lee, K.; Hong, H.; Park, K. Fuzzy System-Based Fear Estimation Based on the Symmetrical Characteristics of Face and Facial Feature Points. Symmetry 2017, 9, 102. [Google Scholar] [CrossRef] [Green Version]
- Sapiński, T.; Kamińska, D.; Pelikant, A.; Anbarjafari, G. Emotion Recognition from Skeletal Movements. Entropy 2019, 21, 646. [Google Scholar] [CrossRef] [Green Version]
- Lisetti, C.L.; Nasoz, F. Using noninvasive wearable computers to recognize human emotions from physiological signals. EURASIP J. Appl. Signal. Process. 2004, 2004, 1672–1687. [Google Scholar] [CrossRef] [Green Version]
- Li, L.; Chen, J.H. Emotion recognition using physiological signals. In Advances in Artificial Reality and Tele-Existence; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4282, pp. 437–446. [Google Scholar]
- Patel, M.; Lal, S.K.L.; Kavanagh, D.; Rossiter, P. Applying neural network analysis on heart rate variability data to assess driver fatigue. Expert Syst. Appl. 2011, 38, 7235–7242. [Google Scholar] [CrossRef]
- Jang, E.H.; Park, B.J.; Kim, S.H.; Chung, M.A.; Sohn, J.H. Classification of three emotions by machine learning algorithms using psychophysiological signals. Int. J. Psychophysiol. 2012, 85, 402–403. [Google Scholar] [CrossRef]
- Soleymani, M.; Pantic, M.; Pun, T. Multimodal emotion recognition in response to videos. IEEE Trans. Affect. Comput. 2012, 3, 211–223. [Google Scholar] [CrossRef] [Green Version]
- Chang, C.Y.; Chang, C.W.; Zheng, J.Y.; Chung, P.C. Physiological emotion analysis using support vector regression. Neurocomputing 2013, 122, 79–87. [Google Scholar] [CrossRef]
- Liu, Y.; Ritchie, J.M.; Lim, T.; Kosmadoudi, Z.; Sivanathan, A.; Sung, R.C.W. A fuzzy psycho-physiological approach to enable the understanding of an engineer’s affect status during CAD activities. CAD Comput.-Aided. Des. 2014, 54, 19–38. [Google Scholar] [CrossRef] [Green Version]
- Verma, G.K.; Tiwary, U.S. Multimodal fusion framework: A multiresolution approach for emotion classification and recognition from physiological signals. Neuroimage 2014, 102, 162–172. [Google Scholar] [CrossRef]
- Nasoz, F.; Lisetti, C.L.; Vasilakos, A.V. Affectively intelligent and adaptive car interfaces. Inf. Sci. 2010, 180, 3817–3836. [Google Scholar] [CrossRef]
- Regtien, P.P.L. Sensors for Mechatronics, 2nd ed.; Elsevier: Amsterdam, The Netherlands, 2012. [Google Scholar]
- Takahashi, K. Remarks on Emotion Recognition from Bio-Potential Signals. In Proceedings of the IEEE International Conference on Industrial Technology, Hammamet, Tunisia, 8–10 December 2004; Volume 3, pp. 1138–1143. [Google Scholar]
- Lin, C.J.; Lin, C.-H.; Wang, S.-H.; Wu, C.-H. Multiple Convolutional Neural Networks Fusion Using Improved Fuzzy Integral for Facial Emotion Recognition. Appl. Sci. 2019, 9, 2593. [Google Scholar] [CrossRef] [Green Version]
- Zucco, C.; Calabrese, B.; Cannataro, M. Sentiment analysis and affective computing for depression monitoring. In Proceedings of the 2017 IEEE International Conference on Bioinformatics and Biomedicine (BIBM 2017), Kansas City, MO, USA, 13–16 November 2017; pp. 1988–1995. [Google Scholar]
- Picard, R.W. Affective computing: Challenges. Int. J. Hum. Comput. Stud. 2003, 59, 55–64. [Google Scholar] [CrossRef]
- Picard, R.W. Affective Computing: From laughter to IEEE. IEEE Trans. Affect. Comput. 2010, 1, 11–17. [Google Scholar] [CrossRef] [Green Version]
- Poria, S.; Cambria, E.; Bajpai, R.; Hussain, A. A review of affective computing: From unimodal analysis to multimodal fusion. Inf. Fusion 2017, 37, 98–125. [Google Scholar] [CrossRef] [Green Version]
Type of Waves | Related Emotional State | Short Description |
---|---|---|
Delta (δ) (0.5–4 Hz) | Strong sense of empathy and intuition | The slowest brain waves, typically associated with sleep. Activity in this range is accompanied by the release of human growth hormone, which aids healing. Delta waves produced in the waking state indicate an opportunity to access subconscious activity. |
Theta (θ) (4–8 Hz) | Deep relaxation, meditation | Theta waves are produced mainly by adults during light sleep or dreaming. They normally appear when the eyes close and disappear when they open. Activity in this band is mainly associated with stress relief and memory recollection. This twilight state can be used to reach deeper meditation, improving health as well as creativity and learning capabilities. |
Alpha (α) (8–16 Hz) | Creativity, relaxation | These waves are mostly present during awake relaxation with the eyes closed; alpha is the resting state of the brain. Alpha activity decreases in response to motor activity of any kind. Alpha waves aid overall mental coordination, calmness, alertness, mind/body integration, and learning. |
Beta (β) (16–32 Hz) | Alertness, concentration | Beta waves are the dominant rhythm when a person is in an alert or anxious state. They are usually generated in the frontal and central parts of the brain. In this state the brain readily performs analysis, processes information, and generates solutions and new ideas. |
Gamma (γ) (above 32 Hz) | Learning, memory, language processing, and ideation | Elevated gamma activity can accompany abnormal conditions or mental disorders. Gamma waves are the fastest brain waves and relate to the simultaneous processing of information from different brain areas. Numerous theories propose that gamma contributes directly to brain function, while others argue that it is better viewed as a byproduct of network activity. |
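The frequency bands in the table above underpin most EEG feature extraction for emotion recognition: a typical first step is to estimate the power spectral density of each channel and integrate it over each band. A minimal sketch, assuming Welch's method and a 45 Hz gamma ceiling (the table only gives a lower bound for gamma; `band_powers` is an illustrative helper, not from the reviewed studies):

```python
import numpy as np
from scipy.signal import welch

# Band limits (Hz) follow the table above; the 45 Hz gamma ceiling is an
# illustrative assumption.
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 16.0),
    "beta": (16.0, 32.0),
    "gamma": (32.0, 45.0),
}

def band_powers(eeg, fs):
    """Absolute power per band for a single-channel EEG trace (hypothetical helper)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 4 * int(fs)))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])  # integrate PSD over the band
    return powers

# Synthetic check: a noisy 10 Hz oscillation should be dominated by alpha power.
fs = 128.0
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
p = band_powers(eeg, fs)
assert max(p, key=p.get) == "alpha"
```

In practice the band powers (often log-transformed or normalized by total power) serve as the per-channel features fed to the classifiers surveyed below.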
Aim | Emotions | Hardware and Software | Ref. |
---|---|---|---|
Creation of an emotion classification system using EEG signals. | High/low arousal and valence | 5-channel wireless Emotiv Insight headset | [52]
Creation of a new emotion evaluation technique based on a three-layer EEG-ER scheme. | High/low arousal and valence | Electro-cap (Quik-Cap 64) from the NeuroScan system (Compumedics Inc., Charlotte, NC, USA) | [53]
Research of Relief-based channel selection methods for EEG-based emotion recognition | Joy, fear, sadness, relaxation | − | [54] |
Creation of an intelligent emotion recognition system to improve special students' learning process | Happiness, calmness, sadness, fear | Emotiv EPOC system; 14 electrodes with two reference channels were used | [55]
Automated human emotions recognition from EEG signal using higher order statistics methods. | High/low arousal and valence | The EEG input signals were provided by the DEAP database | [56] |
Creation of a new method for the recognition of human emotions | High/low arousal and valence | A multi-channel EEG device was used | [57]
New EEG-based emotion recognition approach with a novel time-frequency feature extraction technique is presented | High/low arousal and valence | The EEG signals provided by the DEAP dataset | [58] |
New deep learning framework based on a multiband feature matrix (MFM) and a capsule network (CapsNet) is proposed. | High/low arousal, valence and dominance | The DEAP dataset was used | [59] |
A new cross-subject emotion recognition model based on the newly designed multiple transferable recursive feature elimination is developed | High/low arousal, valence and dominance | 32-channel data from the DEAP dataset was used to validate the proposed method | [60]
Presents a novel approach based on the multiscale information analysis (MIA) of EEG signals for distinguishing emotional states. | High/low arousal and valence | The EEG input signals were provided by the DEAP dataset | [61]
Parameter | Duration, s | Amplitude, mV | Short Description |
---|---|---|---|
P | ~0.04 | ~0.1–0.25 | This wave results from atrial contraction (depolarization). A P wave that exceeds typical values might indicate atrial hypertrophy.
PR | 0.12–0.20 | – | The PR interval is measured from the start of the P wave to the start of the Q wave. It represents the duration of atrial depolarization (contraction).
QRS Complex | 0.08–0.12 | – | The QRS complex is measured from the start of the Q wave to the end of the S wave. It represents the duration of ventricular depolarization (contraction). A longer duration might indicate the presence of bundle branch blocks.
QT/QTc | ~0.41 | – | It is measured from the start of the Q wave to the end of the T wave. The QT interval represents the duration of contraction and relaxation of the ventricles. The duration of QT/QTc varies inversely with the heart rate.
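Because the QT interval shortens as heart rate rises, comparisons across recordings use a rate-corrected value, which is what the QTc entry in the table above denotes. A minimal sketch using Bazett's formula, a standard clinical correction (the table itself does not specify which correction is used):

```python
import math

def qtc_bazett(qt_s, rr_s):
    """Bazett's rate-corrected QT interval: QTc = QT / sqrt(RR), all in seconds."""
    return qt_s / math.sqrt(rr_s)

# At 60 bpm (RR = 1 s) the correction is neutral, so QTc equals the measured QT.
assert abs(qtc_bazett(0.41, 1.0) - 0.41) < 1e-12
# At a faster rate (RR = 0.6 s, i.e. 100 bpm) the same measured QT yields a longer QTc.
assert qtc_bazett(0.41, 0.6) > qtc_bazett(0.41, 1.0)
```

Other corrections (e.g. Fridericia's cube-root variant) follow the same pattern with a different exponent on RR.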
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
Study focuses on emotion recognition for service robots in the living space | High/neutral/low valence. Negative arousal categorized into: sadness, anger, disgust, and fear | ECG | Wireless bio sensor RF-ECG | [1] |
This research suggests an ensemble learning approach for developing a machine learning model that can recognize four major human emotions | Anger, sadness, joy, and pleasure | ECG | SpikerShield Heart and Brain sensor | [51]
Creation of a new methodology for the evaluation of interactive entertainment technologies. | Level of arousal | ECG, galvanic skin response (GSR), electromyography of the face, heart rate | Digital camera, ProComp Infiniti system and sensors, BioGraph software from Thought Technology | [4]
Presentation of a new affective computing (AfC) methodology capable of recognizing the emotional state of a subject. | High/low valence and arousal | ECG, EEG | B-Alert X10 sensor (Advanced Brain Monitoring, Inc., USA) | [77]
Proposes a new method for the automatic location of the P-QRS-T waves and automatic feature extraction | Joy and sadness | ECG | BIOPAC System MP150 | [73]
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
Stress level evaluation in human–computer interaction. | Stress | GSR, eye activity | Mindfield eSense sensor, Tobii eye-tracker environment (Tobii Studio) | [34]
Creation of textile wearable system, which is able to perform an exosomatic EDA measurement using AC and DC methods. | Level of arousal | GSR | Textile electrodes, from Smartex s.r.l. (Pisa, Italy), installed into special glove | [89] |
Research of proposed methodologies for emotions recognition from physiological signals | Valence and arousal levels | GSR, heart rate | Polar-based system, Armband from Bodymedia | [90] |
Assessment of human emotions using peripheral as well as EEG physiological signals on short-time periods | High/neutral/low valence and arousal | GSR, EEG, blood pressure | Biosemi Active II system (http://www.biosemi.com), plethysmograph to measure blood pressure | [91] |
Assessment of human emotion from physiological signals by means of pattern recognition and classification techniques | High/low valence and arousal | GSR, EEG, blood pressure, respiration, temperature | Biosemi Active II device (http://www.biosemi.com), GSR sensor, plethysmograph, respiration belt and a temperature sensor | [92]
Creation of wearable system for measuring emotion-related physiological parameters | – | GSR, heart rate, skin temperature | Originally designed glove with installed sensors | [93] |
Validation of a new method for emotional experience evaluation extracting semantic information from the autonomic nervous system | High/low valence and arousal | GSR, ECG, heart rate | Bodymedia Armband, InnerView Research Software 4.1 from Bodymedia | [94]
Development of a two-state emotion recognition engine for mobile phones | Pleasant/unpleasant | GSR, photoplethysmogram (PPG), skin temperature | – | [95]
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
The objective of this study was to recognize emotions using EEG and peripheral signals. | High/low valence and arousal | HRV, EEG, GSR, blood pressure, respiration | Biosemi Active II system (http://www.biosemi.com), GSR sensor, plethysmograph, respiration belt | [112]
Creation of a new method for the identification of happiness and sadness | Happiness and sadness | HRV, skin temperature (SKT) | SKT sensor, PPG sensor | [113]
The aim of this project was to design a noninvasive system capable of recognizing human emotions using smart sensors | Happiness (excitement), sadness, relaxed (neutral), and angry | HRV, skin temperature (SKT), GSR | Custom-made PPG sensor, DS600 temperature sensor by Maxim–Dallas Semiconductor, custom-made GSR sensor | [114]
This article describes the development of a wearable sensor platform to monitor a mental stress. | Mental stress | HRV, GSR, respiration | Heart rate monitor (HRM) (Polar WearLink+; Polar Electro Inc.), Respiration sensor (SA9311M; Thought Technology Ltd.), GSR sensor (E243; In Vivo Metric Systems Corp.). EMG module (TDE205; Bio-Medical Instruments, Inc.) | [115] |
This paper investigated the ability of PPG to recognize emotion | High/low valence and arousal | HRV | PPG sensor | [116] |
The present research proposes a novel emotion recognition framework for the computer prediction of human emotions using wearable biosensors | Happiness/joy, anger, fear, disgust, sadness | HRV, GSR, SKT, activity recognition | PPG sensor, GSR sensor, SKT fingertip temperature sensor; EMG, gyroscopes and accelerometer for activity recognition; Android smartphone for data collection | [117]
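The HRV studies summarized above generally start from the same time-domain features, computed over the series of RR intervals (the times between successive R peaks of the ECG or PPG pulse train). A minimal sketch, assuming R-peak timestamps have already been detected; the feature names follow common HRV conventions, and `hrv_features` is an illustrative helper rather than any study's implementation:

```python
import numpy as np

def hrv_features(r_peak_times):
    """Time-domain HRV features from R-peak timestamps in seconds (hypothetical helper)."""
    rr = np.diff(r_peak_times)  # RR intervals between successive beats, s
    return {
        "mean_hr_bpm": 60.0 / rr.mean(),                          # average heart rate
        "sdnn_ms": 1000.0 * rr.std(ddof=1),                       # overall variability
        "rmssd_ms": 1000.0 * np.sqrt(np.mean(np.diff(rr) ** 2)),  # beat-to-beat variability
    }

# A perfectly regular 75 bpm rhythm (RR = 0.8 s) has essentially zero variability.
peaks = np.arange(13) * 0.8
f = hrv_features(peaks)
assert abs(f["mean_hr_bpm"] - 75.0) < 1e-6
assert f["sdnn_ms"] < 1e-9 and f["rmssd_ms"] < 1e-9
```

Frequency-domain HRV features (LF/HF power ratios) are built on the same RR series after resampling it to a uniform time grid.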
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
This paper investigates computational emotion recognition using multimodal physiological signals | Positive, negative and neutral arousal | PPG, GSR, respiration rate, skin temperature | Pulse oximeter (PP-CO12, TEAC Co.), GSR (PPS-EDA, TEAC Co.; AP-U030, TEAC Co.), respiration rate sensor (AP-C021, TEAC Co.), temperature sensor clip (AP-C050, TEAC Co.) | [121]
This paper introduces an automated approach to emotion recognition based on several biosignals | Stress, disappointment, euphoria | Electromyograms (EMGs), ECG, respiration rate, and GSR | EMG textile fireproof sensors; ECG and respiration sensors on the thorax; GSR textile fireproof sensor placed in a special glove | [122]
To compare how time, frequency, and time-frequency features derived from thermal infrared data discriminate between self-reported affective states of an individual in response to visual stimuli drawn from the International Affective Picture System | High/neutral/low valence and arousal | Facial thermal infrared data, blood volume pulse (BVP), and respiratory effort | FLIR Systems ThermaCAM (SC640) long wavelength infrared (LWIR) camera, piezo crystal respiratory effort sensor belt 1370G by Grass Technologies, BVP sensor (PPS) by Grass Technologies, atmospheric temperature sensor HS-2000D | [123]
Design of an experimental stand used to monitor human–system interaction | High/low arousal | GSR, electromyography (EMG), respiration rate, EEG, blood volume pulse, temperature | SC-Flex/Pro sensor, MyoScan Pro EMG, respiration rate sensor, EEG-Z sensor, HRV/BVP Flex/Pro sensor, temperature sensor | [124]
This paper aims at assessing human emotion recognition by means of the analysis of HRV with varying spectral bands based on respiratory frequency | High/neutral/low arousal | ECG, respiration rate, blood pressure (BP), skin temperature (ST), GSR | ECG, blood pressure, skin temperature and GSR sensors | [125]
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
Present App for smartphones CaptureMyEmotion, which can improve learning process of autistic children. | High/low arousal | SKT, GSR, motion analysis | Q sensor from Affectiva (www.affectiva.com). | [134] |
Proposed a new method for evaluating fear, based on nonintrusive measurements obtained using multiple sensors | Fear | EEG, SKT, eye blinking rate | EEG device (Emotiv EPOC), commercial thermal camera (ICI 7320 Pro) commercial web-camera (C600) and a high-speed camera | [135] |
Study of infant emotion relying on the assessment of expressive behavior and physiological responses | Joyful emotion | SKT | Thermal imaging system (TH3104MR, NEC San-ei) | [136] |
To demonstrate that the effects of particular emotional stimuli depend not only on physical temperatures but also on homeostasis/thermoregulation. | Emotionally warm or emotionally cold state | - | – | [137] |
Present a new methodology that offers a sensitive and robust tool for automatically capturing facial physiological changes | High/low valence and arousal | SKT, ECG, GSR | ECG and GSR National Instruments (NI) devices, infrared camera FLIR A615 | [138] |
Evaluation of the possibility of wireless skin temperature measurement using iButtons | – | – | iButton (type DS1921H; Maxim/Dallas Semiconductor Corp., USA) | [139] |
Present a new approach to analyzing the physiological signals associated with emotions | Sadness, amusement, fear, anger, surprise | SKT, GSR | BodyMedia SenseWear armband | [140] |
Present a new StressCam methodology for the non-contact evaluation of stress level | Stress | SKT | Infrared camera | [141] |
Emotion | Involved Muscles | Actions |
---|---|---|
Happiness | Orbicularis oculi, Zygomaticus major | Closing eyelids, pulling mouth corners upward and laterally |
Surprise | Frontalis, Levator palpebrae superioris | Raising eyebrows, raising upper eyelid |
Fear | Frontalis, Corrugator supercilii, Levator palpebrae superioris | Raising eyebrows, lowering eyebrows, raising upper eyelid |
Anger | Corrugator supercilii, Levator palpebrae superioris, Orbicularis oculi | Lowering eyebrows, raising upper eyelid, closing eyelids |
Sadness | Frontalis, Corrugator supercilii, Depressor anguli oris | Raising eyebrows, lowering eyebrows, depressing lip corners |
Disgust | Levator labii superioris, Levator labii superioris alaeque nasi | Raising upper lip, raising upper lip and wrinkling nasal skin |
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
Research into the possibility of reliably recognizing emotional states using noninvasive low-cost EEG, EMG, and GSR sensors | High/low valence and arousal | EEG, GSR, EMG, HRV | BrainLink headset, Neuroview acquisition software, Shimmer GSR+ unit, Shimmer EMG device, a plethysmograph | [152] |
Present a new approach for monitoring and detecting the emotional state of the elderly | High/low arousal | EDA, HRV, EMG, SKT, activity tracker | Custom-made EDA sensor, a plethysmograph, SKT resistance temperature detector, 3-axis accelerometer | [153] |
Present a model that determines emotions in real time | High/low valence and arousal | EMG, GSR | ProComp Infiniti Bio-signal Encoder, GSR sensor | [154] |
Present a methodology and a wearable system for the evaluation of the emotional states of car-racing drivers | Anger, fear, disgust, sadness, enjoyment, and surprise | EMG, GSR, ECG, respiration rate | EMG textile fireproof sensors; ECG and respiration sensors on the thorax of the driver; GSR sensor in the glove | [122] |
Present a fully implemented emotion recognition system including data analysis and classification | Joy, anger, pleasure, sadness | EMG, ECG, GSR, respiration rate | Four-channel EMG, ECG, GSR, respiration rate biosensor | [155] |
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
Present a novel strategy (ASFM) for emotion recognition | Positive, neutral, negative emotions | EMG, EOG | Offline experiments performed using the SEED dataset | [164] |
Present a novel approach for a sensor-based E-Healthcare system | Positive, neutral, negative emotions | EOG, IROG | Neuroscan system (Compumedics Neuroscan, Charlotte, NC, USA), infrared camera with a resolution of 1280 × 720 | [165] |
Propose a new approach toward the recognition of emotions using stimulated EOG signals | Positive, neutral, negative emotions | EOG | Customized EOG data acquisition device, Ag/AgCl electrodes | [166] |
Introduce an emotion recognition system based on human eye movements | Happy, sad, angry, afraid, pleasant | EOG | Video-based eye trackers | [167] |
Present a novel strategy of eye movement analysis as a new modality for recognizing human activity. | Arousal level | EOG | Commercial system Mobi from Twente Medical Systems International (TMSI) | [168] |
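The EOG studies above typically reduce the raw eye-movement signal to scalar features such as blink rate before classification. A minimal sketch of threshold-based blink counting on a synthetic vertical-EOG trace (the function name, threshold, and refractory period are illustrative assumptions, not taken from any cited study):

```python
import numpy as np

def detect_blinks(veog, fs, threshold_uv=200.0, refractory_s=0.3):
    """Count blink events in a vertical EOG trace.

    A blink appears as a large positive deflection; supra-threshold
    samples closer together than `refractory_s` are merged into one event.
    """
    above = np.where(veog > threshold_uv)[0]
    if above.size == 0:
        return 0
    # A new blink starts wherever the gap between supra-threshold
    # samples exceeds the refractory period
    gaps = np.diff(above) > int(refractory_s * fs)
    return 1 + int(np.count_nonzero(gaps))

# Synthetic 10 s vertical EOG at 250 Hz: noise floor plus 4 blink spikes
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
veog = rng.normal(0.0, 20.0, t.size)          # ~20 uV noise floor
for onset in (1.0, 3.5, 6.0, 8.2):            # blink onsets in seconds
    i = int(onset * fs)
    veog[i:i + 25] += 400.0 * np.hanning(25)  # ~100 ms, ~400 uV deflection

n = detect_blinks(veog, fs)
blink_rate_per_min = n / 10 * 60
print(n, blink_rate_per_min)  # 4 blinks -> 24 blinks/min
```

Real pipelines would add band-pass filtering and subject-specific threshold calibration; this sketch only shows the feature-extraction idea.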
Emotions | Gestures and Postures |
---|---|
Happiness | Body extended, shoulders up, arms lifted up or away from the body |
Interest | Lateral hand and arm movement and arm stretched out frontal |
Surprise | Right/left hand going to the head; two hands covering the cheeks (self-touch); two hands covering the mouth; head shaking; body shifting backward |
Boredom | Raising the chin (moving the head backward), collapsed body posture, and head bent sideways, covering the face with two hands |
Disgust | Shoulders forward, head downward and upper body collapsed, and arms crossed in front of the chest, hands close to the body |
Hot anger | Lifting the shoulder, opening and closing hand, arms stretched out frontal, pointing, and shoulders squared |
Aim | Emotions | Methods | Hardware and Software | Ref. |
---|---|---|---|---|
Presentation of ASCERTAIN, a multimodal database for implicit personality and affect recognition using commercial physiological sensors | High/low valence and arousal | GSR, EEG, ECG, HRV, facial expressions | GSR sensor, ECG sensor, EEG sensor, webcam to record facial activity, Lucid Scribe software | [179] |
Creation of a personalized tool for a child to learn about and discuss their feelings | Real-time arousal and stress level | Facial expression recognition | Smartphone camera, CaptureMyEmotion application | [180] |
This paper aims to explore the limitations of automatic affect recognition applied in the usability context and to propose a set of criteria for selecting input channels for affect recognition | Valence and arousal, interest, slight confusion, joy, sense of control | GSR, facial expressions | Infiniti Physiology Suite software; standard internet camera and video capture software from Logitech; Noldus FaceReader; Morae GSR recorder | [181] |
This study proposes a new method that analyzes multiple data considering the symmetrical characteristics of the face and facial feature points | Fear | Movement of facial feature points such as eyes, nose, and mouth | FLIR Tau2 640 thermal camera, NIR filter, Logitech C600 web camera | [182] |
Present a novel method for computerized emotion perception based on posture to determine the emotional state of the user | Happiness, interest, boredom, disgust, hot anger | Body postures | C++ on Ubuntu 14.04, Kinect for Microsoft Xbox 360, and OpenNI SDK | [176] |
To propose a novel method to recognize seven basic emotional states utilizing body movement | Happiness, sadness, surprise, fear, anger, disgust and neutral state | Gestures and body movements | Kinect v2 sensor | [183] |
Emotions | Measurement Methods | Data Analysis Methods | Accuracy | Ref. |
---|---|---|---|---|
Sadness, anger, stress, surprise | ECG, SKT, GSR | SVM | Correct-classification ratios were 78.4% and 61.8% for the recognition of three and four categories, respectively | [133] |
Sadness, anger, fear, surprise, frustration, and amusement | GSR, HRV, SKT | KNN, DFA, MBP | KNN, DFA, and MBP, could categorize emotions with 72.3%, 75.0%, and 84.1% accuracy, respectively | [184] |
Three levels of driver stress | ECG, EOG, GSR and respiration | Fisher projection matrix and a linear discriminant | Three levels of driver stress with an accuracy of over 97% | [126] |
Fear, neutral, joy | ECG, SKT, GSR, respiration | Canonical correlation analysis | Correct-classification ratio of 85.3%; classification rates for fear, neutral, and joy were 76%, 94%, and 84%, respectively | [185] |
The emotional classes identified are high stress, low stress, disappointment, and euphoria | Facial EMG, ECG, GSR, respiration | SVM and adaptive neuro-fuzzy inference system (ANFIS) | The overall classification rates achieved using tenfold cross-validation are 79.3% and 76.7% for the SVM and the ANFIS, respectively | [122] |
Fatigue caused by driving for extended hours | HRV | Neural network | The neural network gave an accuracy of 90% | [186] |
Boredom, pain, surprise | GSR, ECG, HRV, SKT | Machine learning algorithms: linear discriminant analysis (LDA), classification and regression tree (CART), self-organizing map (SOM), and SVM | LDA achieved an accuracy of 78.6%, CART 93.3%, and SOM 70.4%; emotion classification using SVM reached an accuracy of 100.0% | [187] |
The arousal classes were calm, medium aroused, and activated; the valence classes were unpleasant, neutral, and pleasant | ECG, pupillary response, gaze distance | Support vector machine | Best classification accuracies of 68.5% for three labels of valence and 76.4% for three labels of arousal | [188] |
Sadness, fear, pleasure | ECG, GSR, blood volume pulse, pulse | Support vector regression | Recognition rate up to 89.2% | [189] |
Frustration, satisfaction, engagement, challenge | EEG, GSR, ECG | Fuzzy logic | 84.18% for frustration, 76.83% for satisfaction, 97% for engagement, 97.99% for challenge | [190] |
Terrible, love, hate, sentimental, lovely, happy, fun, shock, cheerful, depressing, exciting, melancholy, mellow | EEG, GSR, blood volume pressure, respiration pattern, SKT, EMG, EOG | Support vector machine (SVM), multilayer perceptron (MLP), K-nearest neighbor (KNN), and meta-multiclass (MMC) | The average accuracies are 81.45%, 74.37%, 57.74%, and 75.94% for the SVM, MLP, KNN, and MMC classifiers, respectively; the best single-emotion accuracy is 85.46% for ‘Depressing’ using SVM, with an overall accuracy of 85% across 13 emotions | [191] |
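Most classifiers in the table above follow the same pipeline: windowed physiological signals are reduced to scalar features (mean GSR level, an HRV statistic, skin temperature slope, and the like) and fed to a standard classifier such as SVM or KNN. A minimal KNN sketch on synthetic features, illustrating the pipeline only (the feature names, class means, and split are invented for the example and do not reproduce any cited study):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Classify each test window by majority vote of its k nearest
    training windows (Euclidean distance in feature space)."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(d)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

rng = np.random.default_rng(42)
n = 100  # windows per class

# Synthetic per-window features: [mean GSR (uS), HRV SDNN (ms), SKT slope (deg C/min)]
calm    = rng.normal([2.0, 60.0, 0.00], [0.5, 10.0, 0.02], size=(n, 3))
aroused = rng.normal([5.0, 35.0, -0.05], [0.8, 8.0, 0.02], size=(n, 3))
X = np.vstack([calm, aroused])
y = np.array([0] * n + [1] * n)  # 0 = low arousal, 1 = high arousal

# Hold out every fourth window for testing
test = np.arange(2 * n) % 4 == 0
# Standardize so HRV (tens of ms) does not dominate the distance metric
mu, sd = X[~test].mean(axis=0), X[~test].std(axis=0)
Xs = (X - mu) / sd

y_hat = knn_predict(Xs[~test], y[~test], Xs[test], k=5)
accuracy = (y_hat == y[test]).mean()
print(f"test accuracy: {accuracy:.2f}")
```

The feature standardization step mirrors a practical concern behind the accuracy differences reported in the table: distance-based methods like KNN are sensitive to feature scaling, whereas tree-based methods like CART are not.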
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Dzedzickis, A.; Kaklauskas, A.; Bucinskas, V. Human Emotion Recognition: Review of Sensors and Methods. Sensors 2020, 20, 592. https://doi.org/10.3390/s20030592