Abstract
Emotional computing is a field of human-computer interaction in which a system has the ability to recognize emotions and react accordingly. Recognizing emotional states is becoming a major part of a user’s context for wearable computing applications. The system should be able to acquire the user’s emotional states using physiological sensors. We want to develop a personal emotional-state recognition system that is practical and reliable and can be used for health-care related applications. We propose to use the eHealth platform, a ready-made, lightweight, small and easy-to-use device, for recognizing a few emotional states (sad, dislike, joy, stress, normal, no-idea, positive and negative) using a decision tree classifier. In this paper, we present an approach to building a system that exhibits this property and provide evidence based on data for 8 different emotional states collected from 24 subjects. Our results indicate that the system has an accuracy rate of approximately 91 %. In our work, we used three physiological sensors (BVP, GSR and EMG) in order to recognize emotional states (stress, joy/happy, sad, normal/neutral, dislike and no-idea).
Keywords
- Emotional states
- Electromyogram
- Blood volume pulse
- Galvanic skin response
- Skin temperature
- International Affective Picture System
- Machine learning classifier
- User studies
1 Introduction
It is hard to express your own emotions; no one can accurately measure the degree of his or her emotional state. According to Darwin, “…the young and the old of widely different races, both with man and animals, express the same state of mind by the same movements” [16]. According to Paul Ekman, there are seven basic emotions: anger, fear, sadness, contempt, disgust, surprise and joy [14]. The concept behind emotional-state recognition (also known as affective computing) was first introduced by Rosalind Picard in 1995 [2]. Since then the Affective Computing group has produced novel and innovative projects in that domain [3]. Emotional-state recognition has received attention in recent years and is able to support the health-care industry. Emotions and physical health are strongly linked, and emotions influence the immune system too [15]. When chronic stress goes untreated, the occurrence of an emotional disorder is more than 50 % [6]. According to the Richmond Hypnosis Center, 110 million people die every year due to stress; that is, 7 people every 2 s [4]. According to the American Psychological Association, in 2011 about 53 % of Americans claimed stress as a reason behind personal health problems [5]. According to Dr. Alexander G. Justicz, in the 21st century stress is a huge problem for men [9]. Stress affects our health negatively, causing headaches, stomach problems, sleep problems and migraines. Stress can also cause many mouth problems, the painful TMJ (temporomandibular joint) syndrome, and tooth loss [7]. “Stress has an immediate effect on your body. In the short term, that’s not necessarily a bad thing, but chronic stress puts your health at risk” [8]. Long-term and intense anger can be a cause of mental health problems including depression, anxiety and self-harm. It can also be a cause of high blood pressure, colds and flu, coronary heart disease, stroke, cancer and gastro-intestinal problems [13].
“If you have a destructive reaction to anger, you are more likely to have heart attacks” [12], whereas “an upward-spiral dynamic continually reinforces the tie between positive emotions and physical health” [17]. Although the negative effects of stress are known to people, they choose (deliberately or otherwise) to ignore them. They need to be forcefully notified that they must shrug off negative emotions, for example by sending them calls, video clips, text messages or games [10]. According to a number of studies, negative thinking or depression can adversely affect your health [19]. An automatic, personal application could be very helpful if it could monitor one’s emotional states and persuade people to come out of negative ones. According to William Atkinson, “The best way to overcome undesirable or negative thoughts and feelings is to cultivate the positive ones” [18]. Emotion recognition technology can tackle this problem, as it is able to monitor an individual’s emotional states. Such a system could also send an alarming call to a person who has been in a negative emotional state for a long time, or notify caregivers or family members. The system can also log an individual’s emotional states for later analysis. In some cases, especially in heart disease, doctors require emotional states along with physical activity and physiological information in order to examine a patient’s condition when the patient is away from the clinic [11]. We want to develop a system for recognizing emotional states using physiological sensors; it should be able to identify a few emotional states such as sad, dislike, joy, stress, normal, no-idea, positive and negative.
2 Related Work
Interest in recognizing emotional states with automated systems has increased in recent years. Researchers have developed systems for recognizing emotional states using speech [23–25], facial expressions [26–28] and physiological devices [20–22, 29, 30]. In this research, we want to recognize different emotional states using body-worn physiological devices (EMG, BVP, GSR and temperature). Researchers have used physiological devices to recognize emotional states such as sad [20–22, 30], joy/happy [20–22, 30, 31], normal/neutral [21, 30, 31] and negative [29]. However, these studies used different physiological devices. For example, some researchers recognized joy, anger, sadness, fear and relaxation using EEG, GSR and a pulse sensor, with audio and video clips as the stimulus for eliciting emotions [20]. Others recognized happiness, sadness, fear, surprise, disgust and neutral using ECG, again with audio and video clips as the stimulus [21]. Others recognized joy, anger, sadness and pleasure using ECG, EMG, skin conductance and a respiration sensor, with music as the stimulus [22]. In another case, researchers gathered data from blood volume pulse, electromyogram, respiration and skin conductance sensors. They conducted 20 experiments on 20 consecutive days, recording around 25 min per day from each individual. They distinguished the emotional states neutral, anger, hate, grief, love, romantic, joy and reverence from the data, achieving 81 % classification accuracy among the eight states [31]. Different techniques can be used as a stimulus for eliciting emotions, e.g. pictures, video clips, audio clips and games. In our work, we used the International Affective Picture System (IAPS) for stimulation.
The IAPS is widely used in experiments studying emotion and attention: it provides normative emotional stimuli for experimental investigations of emotion and attention. Its goal is to provide a large set of standardized, emotionally evocative, internationally accessible color photographs covering a range of semantic categories [32]. The researchers mentioned above attached sensors to different parts of the body; in our research we used only the left arm for sensor placement.
3 Hypothesis
The physiological data measured by wearable devices (EMG, blood volume pulse, and skin conductance sensors) indicate, via machine learning classifiers (J48 and IBk), which emotional state (sad, dislike, joy, stress, normal, no-idea, positive, or negative) the person is in.
4 Experimental Methodology
We developed the following systems for the user study.
4.1 eHealth Platform and Application
We used the eHealth platform [1] to recognize emotional states (Fig. 1) and connected a Raspberry Pi [41] to it (Fig. 2).
The eHealth platform comes with several sensors: a 2D accelerometer, a blood pressure sensor, a pulse and blood-oxygen sensor, a body temperature sensor, an airflow (breathing) sensor, an electrocardiogram (ECG) sensor, an electromyography (EMG) sensor and a galvanic skin response (GSR) sensor. We used the galvanic skin response sensor, the body temperature sensor and the electromyography (EMG) sensor, together with a separate blood volume pulse (BVP) sensor [40]. We connected the GSR, EMG and BVP sensors to the board and wrote a piece of code that reads the values from these sensors and writes them to a network port.
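The acquisition code itself is not listed in the paper; the following is only a minimal sketch of the idea, in which `read_sensors()` is a hypothetical stand-in for the board’s API and the host/port values are placeholders:

```python
import socket

def read_sensors():
    """Hypothetical stand-in for the eHealth API; on the real board these
    values would come from the GSR, EMG and BVP inputs."""
    return {"gsr": 512, "emg": 300, "bvp": 420}

def format_sample(sample, t):
    # One comma-separated line per sample: timestamp, GSR, EMG, BVP
    return f"{t:.3f},{sample['gsr']},{sample['emg']},{sample['bvp']}\n"

def stream_samples(host="localhost", port=5000, n_samples=10, rate_hz=650):
    """Read n_samples and write them, one line each, to a network port."""
    with socket.create_connection((host, port)) as sock:
        for i in range(n_samples):
            line = format_sample(read_sensors(), i / rate_hz)
            sock.sendall(line.encode("ascii"))
```

A consumer on the other end of the socket can then log or classify the incoming lines.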
4.2 IAPS and Its Application (Application Stimulus)
We obtained access to the IAPS [32] images, which have already been used by several researchers in emotional computing [33–39]. We implemented an application in C#.NET that shows participants IAPS images in a sequence in order to change their emotional states, and that records the start and end time of each IAPS image during the experiments. After showing participants five images from each group, our application asked them about their current emotional state using a Likert-scale approach (as shown in the figure below). We chose 100 IAPS images from different categories and presented them in the following order.
The images were shown as a slide show with a timer of 5 s per image. For the questionnaire we used radio buttons, and participants had to choose one emotional state; the application also stored the participants’ personal information, i.e. age, gender, height and weight. Participants were asked to wear the sensors on their left arm, palm and fingers (Fig. 3). They were also required to perform the experiment twice; the first run let the participants familiarize themselves with the setup, while the second run was the one actually used for analyzing their data.
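The session flow described above (five images per block, then a Likert prompt) can be sketched as follows. This is a simplified Python sketch, not the actual C#.NET application; the `show` and `ask_likert` callbacks are assumed placeholders for the UI layer:

```python
EMOTION_OPTIONS = ["Normal", "Sad", "Dislike", "Joy", "Stress", "No-Idea"]
IMAGE_SECONDS = 5  # each image stays on screen for 5 s

def make_blocks(images, block_size=5):
    """Split the ordered IAPS image list into blocks of five; after each
    block the participant reports their current emotional state."""
    return [images[i:i + block_size] for i in range(0, len(images), block_size)]

def run_session(images, show, ask_likert):
    """show(img, seconds) displays one image; ask_likert(options) returns
    the chosen emotional state. Both are supplied by the UI layer."""
    responses = []
    for block in make_blocks(images):
        for img in block:
            show(img, IMAGE_SECONDS)
        responses.append(ask_likert(EMOTION_OPTIONS))
    return responses
```

With 100 images this yields 20 blocks and therefore 20 self-reported emotional states per run.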
5 Results and Analysis
We recruited 26 participants (21 males, 5 females) for our experiment; two of them could not complete the experiments, so we ended up with 24 participants (19 males, 5 females). Participants ranged in age from 20 to 44 (mean 26.17, SD 5.14) and in BMI (body mass index) from 18.7 to 26.6 (mean 21.44, SD 2.17). Participants were required to conduct the experiment twice, on different days. They were asked to choose one of the following emotional states during the experiments:
Normal, Sad, Dislike, Joy, Stress and No-Idea
We received values from three sensors (EMG, GSR and BVP) at a sampling rate of around 650 Hz. Our experimental setup was able to change participants’ emotional states. Only four of the participants chose all of the given emotional states. This was because it was hard for participants to distinguish between sad, dislike and stress, and distinguishing between joy and normal during the experiments was not straightforward either. That also explains why some emotional states were ignored by participants. “As everyone knows, emotions seem to be interrelated in various but systematic ways: Excitement and depression seem to be opposites; excitement and surprise seem to be more similar to one another; and excitement and joy seem to be highly similar, often indistinguishable” [43]. Therefore, we generated another dataset from our experimental data, categorizing the emotional states into two collections:
- Positive {Joy, Normal}
- Negative {Sad, Dislike, Stress}; ‘No-Idea’ is excluded
Now, we have the following types of datasets:
- Type1: contains {Normal, Sad, Dislike, Joy, Stress, No-Idea}
- Type2: contains {Positive, Negative}
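The relabeling from the six-class data to the two-class (Type2) dataset amounts to a simple mapping; a sketch in Python (the sample representation as (features, label) pairs is an assumption for illustration):

```python
POSITIVE = {"Joy", "Normal"}
NEGATIVE = {"Sad", "Dislike", "Stress"}

def to_two_class(label):
    """Map a six-class label to 'Positive'/'Negative'; 'No-Idea' maps to None."""
    if label in POSITIVE:
        return "Positive"
    if label in NEGATIVE:
        return "Negative"
    return None  # 'No-Idea' samples are excluded from the Type2 dataset

def build_type2(samples):
    """samples: list of (features, six_class_label) pairs."""
    out = []
    for feats, label in samples:
        two = to_two_class(label)
        if two is not None:
            out.append((feats, two))
    return out
```

Every Type1 sample thus contributes to Type2 except those labeled ‘No-Idea’, which are dropped.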
Because the dataset was very large, it was not possible for the WEKA [44] application to process the data of all 24 participants together. Therefore, we randomly chose small portions of data pertaining to each emotional state from each participant. This gave us two types of data, “Two-Class” and “Six-Class”, to which we applied the J48 and IBk classifiers with 10-fold cross-validation.
Our results show that the J48 and IBk classifiers were able to classify the instances with accuracies of 97.7584 % and 95.4684 % respectively for ‘Two-Class’. For ‘Six-Class’, J48 and IBk achieved accuracies of 96.5014 % and 91.9333 % respectively (Tables 1 and 2).
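WEKA’s IBk is a k-nearest-neighbours classifier; the 10-fold cross-validation protocol used here can be sketched in plain Python with a 1-NN classifier standing in for IBk. This is illustrative only, not our WEKA pipeline:

```python
import random

def knn_predict(train, query, k=1):
    """k-NN on squared Euclidean distance; train is a list of (features, label)."""
    ranked = sorted(train, key=lambda fl: sum((a - b) ** 2
                                              for a, b in zip(fl[0], query)))
    labels = [lab for _, lab in ranked[:k]]
    return max(set(labels), key=labels.count)  # majority vote

def ten_fold_accuracy(data, k=1, folds=10, seed=0):
    """Shuffle, split into 10 folds, train on 9 and test on 1, average accuracy."""
    data = data[:]
    random.Random(seed).shuffle(data)
    fold_size = len(data) // folds
    accs = []
    for i in range(folds):
        test = data[i * fold_size:(i + 1) * fold_size]
        train = data[:i * fold_size] + data[(i + 1) * fold_size:]
        correct = sum(knn_predict(train, feats, k) == lab for feats, lab in test)
        accs.append(correct / len(test))
    return sum(accs) / folds
```

Each of the reported accuracies is the mean over the ten held-out folds, so every instance is tested exactly once.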
6 Conclusion and Future Work
Our system was able to recognize the aforementioned emotional states with high accuracy using physiological devices and the J48 (decision tree) and IBk classifiers. The results show that a few physiological devices are enough for recognizing the required emotional states (‘Sad’, ‘Dislike’, ‘Joy’, ‘Stress’, ‘Normal’, ‘No-Idea’, ‘Positive’ and ‘Negative’). This prototype is only a proof of concept, and our results show that our approach can identify the above-mentioned emotional states independent of BMI (body mass index) and age group. The physiological sensors have to be fixed properly on the participants’ skin in order to predict their emotional states successfully. We will conduct more user studies in which we will use physiological data and facial expressions for recognizing these emotional states.
References
Cooking Hacks: e-Health Sensor Platform V2.0 for Arduino and Raspberry Pi [Biometric/Medical Applications]. http://www.cooking-hacks.com/documentation/tutorials/ehealth-biometric-sensor-platform-arduino-raspberry-pi-medical#step4_9
Picard, R.W.: http://web.media.mit.edu/~picard/index.php
Affective Computing: Publications. http://affect.media.mit.edu/publications.php
Richmond Hypnosis Center. http://richmondhypnosiscenter.com/2013/04/12/sample-post-two/
American Psychological Association: The Impact of Stress. http://www.apa.org/news/press/releases/stress/2011/impact.aspx
WebMD: Stress Management Health Center. http://www.webmd.com/balance/stress-management/effects-of-stress-on-your-body
Krifka, S., Spagnuolo, G., Schmalz, G., Schweikl, H.: A review of adaptive mechanisms in cell responses towards oxidative stress caused by dental resin monomers. Biomaterials 34, 4555–4563 (2013)
Healthline: The Effects of stress on the Body. http://www.healthline.com/health/stress/effects-on-body
Miamiherald: Chronic stress is linked to the six leading causes of death. http://www.miamiherald.com/living/article1961770.html
Online Stress Reliever Games. http://stress.lovetoknow.com/Online_Stress_Reliever_Games
Khan, A.M.: Personal state and emotion monitoring by wearable computing and machine learning. In: BCS-HCI 2012, Newcastle, UK (2011)
WebMD: Stress Management Health Center. http://www.webmd.com/balance/stress-management/features/how-anger-hurts-your-heart
BetterHealth: Anger - how it affects people. http://www.betterhealth.vic.gov.au/bhcv2/bhcarticles.nsf/pages/Anger_how_it_affects_people
Ekman, P.: Basic emotions. In: Dalgleish, T., Power, M. (eds.) Handbook of Cognition and Emotion (PDF). Wiley, Sussex (1999)
Health and Wellness: Are Happy People Healthier? New Reasons to Stay Positive. http://www.oprah.com/health/How-Your-Emotions-Affect-Your-Health-and-Immune-System
Darwin, C.: The Expression of the Emotions in Man and Animals. John Murray, London (1872)
Kok, B.E., Coffey, K.A., Cohn, M.A., Catalino, L.I., Vacharkulksemsuk, T., Algoe, S., Brantley, M., Fredrickson, B.L.: How positive emotions build physical health: Perceived positive social connections account for the upward spiral between positive emotions and vagal tone. Psychol. Sci. 24(7), 1123–1132 (2013)
Atkinson, W.W.: Thought Vibration or the Law of Attraction in the Thought World (1908)
Rush, A.J., Beck, A.T., Kovacs, M., Hollon, S.D.: Comparative efficacy of cognitive therapy and pharmacotherapy in the treatment of depressed outpatients. Cognit. Ther. Res. 1, 17–38 (1977)
Remarks on Emotion Recognition from Bio-Potential Signals: research paper (2004)
Emotion Recognition from Electrocardiogram Signals using Hilbert Huang Transform (2012)
Emotion Pattern Recognition Using Physiological Signals (2014)
Sendlmeier, W., Burkhardt, F.: Verification of Acoustical Correlates of Emotional Speech using Formant-Synthesis, Technical University of Berlin, Germany
Dellaert, F., Polzin, T., Waibel, A.: Recognizing Emotion in Speech
Dai, K., Fell, H.J., MacAuslan, J.: Recognizing Emotion in Speech Using Neural Networks
Recognizing Emotion From Facial Expressions: Psychological and Neurological Mechanisms Ralph Adolphs, University of Iowa College of Medicine
Busso, C., et al.: Analysis of emotion recognition using facial expressions, speech and multimodal information. In: Proceedings of the 6th International Conference on Multimodal Interfaces, ICMI 2004, State College, PA, USA, 13–15 October 2004 (2004)
Perikos, I., Ziakopoulos, E., Hatzilygeroudis, I.: Recognizing Emotions from Facial Expressions Using Neural Network
Monajati, M., Abbasi, S.H., Shabaninia, F., Shamekhi, S.: Emotions States Recognition Based on Physiological Parameters by Employing of Fuzzy-Adaptive Resonance Theory
Selvaraj, J., Murugappan, M., Wan, K., Yaacob, S.: Classification of emotional states from electrocardiogram signals: a non-linear approach based on hurst
Healey, J., Picard, R.W.: Eight-emotion Sentics Data, MIT Affective Computing Group (2002). http://affect.media.mit.edu
The Center for the Study of Emotion and Attention. http://csea.phhp.ufl.edu/Media.html#topmedia
Cuthbert, B.N., Schupp, H.T., Bradley, M.M., Birbaumer, N., Lang, P.J.: Brain potentials in affective picture processing: covariation with autonomic arousal and affective report. Biol. Psychol. 52(2), 95–111 (2000)
Keil, A., Bradley, M.M., Hauk, O., Rockstroh, B., Elbert, T., Lang, P.J.: Large-scale neural correlates of affective picture processing. Psychophysiology 39(5), 641–649 (2002)
Lang, P.J., Bradley, M.M., Fitzsimmons, J.R., Cuthbert, B.N., Scott, J.D., Moulder, B., Nangia, V.: Emotional arousal and activation of the visual cortex: an fMRI analysis. Psychophysiology 35(2), 199–210 (1998)
Bradley, M.M., Sabatinelli, D., Lang, P.J., Fitzsimmons, J.R., King, W., Desai, P.: Activation of the visual cortex in motivated attention. Behav. Neurosci. 117(2), 369–380 (2003)
Sabatinelli, D., Bradley, M.M., Fitzsimmons, J.R., Lang, P.J.: Parallel amygdala and inferotemporal activation reflect emotional intensity and fear relevance. Neuroimage. 24(4), 1265–1270 (2005). Epub 7 Jan 2005
Sabatinelli, D., Lang, P.J., Keil, A., Bradley, M.M.: Emotional perception: correlation of functional MRI and event-related potentials. Cereb. Cortex 17(5), 1085–1091 (2007). Epub 12 Jun 2006
Bradley, M.M., Codispoti, M., Lang, P.J.: A multi-process account of startle modulation during affective perception. Psychophysiology 43(5), 486–497 (2006)
Maker Shed: Pulse Sensor AMPED for Arduino. http://www.makershed.com/products/pulse-sensor-amped-for-arduino
Raspberry Pi. https://www.raspberrypi.org/
Muthusamy, R.P.: Emotion Recognition from Physiological signals using Bio-sensors. https://diuf.unifr.ch/main/diva/sites/diuf.unifr.ch.main.diva/files/T4.pdf - Submitted for Research Seminar on Emotion Recognition on 15 Feb 2012
Russel, J.A., Bullock, M.: Multidimensional scaling of emotional facial expressions. J. Pers. Soc. Psychol. 48, 1290–1298 (1985)
Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.H.: The WEKA data mining software: an update. SIGKDD Explor. 11(1), 11–18 (2009)
© 2016 Springer International Publishing Switzerland
Khan, A.M., Lawo, M. (2016). Recognizing Emotional States Using Physiological Devices. In: Stephanidis, C. (eds) HCI International 2016 – Posters' Extended Abstracts. HCI 2016. Communications in Computer and Information Science, vol 618. Springer, Cham. https://doi.org/10.1007/978-3-319-40542-1_26
Print ISBN: 978-3-319-40541-4
Online ISBN: 978-3-319-40542-1