research-article

Identifying Stable Patterns over Time for Emotion Recognition from EEG

Published: 01 July 2019

Abstract

In this paper, we investigate stable patterns of electroencephalogram (EEG) over time for emotion recognition using a machine learning approach. Up to now, various findings of activated patterns associated with different emotions have been reported. However, their stability over time has not been fully investigated yet. In this paper, we focus on identifying EEG stability in emotion recognition. We systematically evaluate the performance of various popular feature extraction, feature selection, feature smoothing, and pattern classification methods with the DEAP dataset and a newly developed dataset called SEED for this study. Discriminative Graph regularized Extreme Learning Machine with differential entropy features achieves the best average accuracies of 69.67 and 91.07 percent on the DEAP and SEED datasets, respectively. The experimental results indicate that stable patterns exhibit consistency across sessions: the lateral temporal areas activate more for positive emotions than negative emotions in beta and gamma bands; the neural patterns of neutral emotions have higher alpha responses at parietal and occipital sites; and for negative emotions, the neural patterns have significantly higher delta responses at parietal and occipital sites and higher gamma responses at prefrontal sites. The performance of our emotion recognition models shows that the neural patterns are relatively stable within and between sessions.
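The differential entropy (DE) feature mentioned in the abstract has a simple closed form for a Gaussian signal: for a band-filtered EEG segment with variance σ², DE = ½·log(2πeσ²). The sketch below is illustrative only, not the authors' exact pipeline; the band boundaries, filter order, and segment length are assumptions for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def de_feature(signal, fs, band):
    """Differential entropy of a band-filtered segment.

    Assumes the filtered signal is approximately Gaussian, so
    DE = 0.5 * log(2 * pi * e * variance).
    """
    low, high = band
    # 4th-order Butterworth band-pass (order is an illustrative choice)
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(filtered))

# Hypothetical usage: one channel of synthetic data at 200 Hz
rng = np.random.default_rng(0)
fs = 200
x = rng.standard_normal(fs * 4)  # a 4-second segment
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}  # assumed band edges
features = {name: de_feature(x, fs, b) for name, b in bands.items()}
```

In a full pipeline of the kind the abstract describes, one such DE value would be computed per channel and per band for each time window, then smoothed over time and fed to a classifier.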



Published In

IEEE Transactions on Affective Computing, Volume 10, Issue 3, July–Sept. 2019, 148 pages

Publisher

IEEE Computer Society Press, Washington, DC, United States
