A Novel DE-CNN-BiLSTM Multi-Fusion Model for EEG Emotion Recognition
Figure captions:
- Figure 1: The framework of the proposed EEG-based emotion recognition model, DE-CNN-BiLSTM.
- Figure 2: The spatial mapping of the DE features in four frequency bands.
- Figure 3: The spatial structure distribution of the CNN model.
- Figure 4: The structure of the Bi-LSTM.
- Figure 5: (a) Topology changes of the DE over 10–60 s slices in four frequency bands for positive emotion; (b) topology changes of the DE over 10–60 s slices in four frequency bands for negative emotion.
- Figure 6: Training progress of the model in terms of training and validation accuracy for the valence emotional dimension.
- Figure 7: (a) Distribution of the emotion recognition accuracy on the DEAP dataset for valence and arousal; (b) distribution of the emotion recognition accuracy on the SEED dataset.
Abstract
1. Introduction
2. Methods
- (1) The original EEG signals are decomposed into frequency bands reflecting different brain states and divided into time slices.
- (2) We calculate the DE of all slices in each frequency band and map them onto the brain's spatial structure to obtain 4D tensors (a sketch is given under Section 2.1 below).
- (3) A CNN extracts the detailed spatial-structure information and outputs a one-dimensional vector from its last layer (see the sketch under Section 2.2).
- (4) The vectors are fed into the Bi-LSTM, which predicts the emotional state from both past and future information in the time sequence (see the sketch under Section 2.3).
- (5) The softmax function serves as the classifier of the model and outputs the recognition results.
2.1. Multi-Band Decomposition and DE Feature Spatial Mapping
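A minimal sketch of steps (1)–(2), not the authors' released code: band-pass the raw EEG into several frequency bands, compute the differential entropy (DE) of each slice and band under the Gaussian assumption, DE = 0.5·ln(2πeσ²), and scatter the channel-wise DE values onto a 2-D electrode grid to form a 4-D tensor. The band edges, 1 s slice length, and 9×9 grid below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Assumed band cut-offs (Hz); the paper uses four frequency bands.
BANDS = {"theta": (4, 8), "alpha": (8, 14), "beta": (14, 31), "gamma": (31, 45)}

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass along the last (time) axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def differential_entropy(x):
    """DE of an approximately Gaussian signal: 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x, axis=-1) + 1e-12)

def de_4d_tensor(eeg, fs, grid, slice_sec=1.0, grid_shape=(9, 9)):
    """eeg: (channels, samples); grid: assumed channel -> (row, col) electrode map.
    Returns a tensor of shape (n_slices, rows, cols, n_bands)."""
    step = int(slice_sec * fs)
    n_slices = eeg.shape[1] // step
    out = np.zeros((n_slices, *grid_shape, len(BANDS)))
    for b_idx, (low, high) in enumerate(BANDS.values()):
        filtered = bandpass(eeg, low, high, fs)
        for s in range(n_slices):
            de = differential_entropy(filtered[:, s * step:(s + 1) * step])
            for ch, (r, c) in grid.items():   # grid cells without an electrode stay zero
                out[s, r, c, b_idx] = de[ch]
    return out
```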
2.2. Spatial Feature Learning
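A minimal PyTorch sketch of the spatial branch from step (3): a small CNN reads one DE frame (bands × rows × cols) per time slice and emits a one-dimensional feature vector. The layer sizes and kernel counts are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class SpatialCNN(nn.Module):
    def __init__(self, in_bands=4, feat_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # collapse the spatial map
        )
        self.fc = nn.Linear(64, feat_dim)     # last CNN layer -> 1-D vector per slice

    def forward(self, x):                     # x: (batch, bands, rows, cols)
        h = self.conv(x).flatten(1)
        return self.fc(h)                     # (batch, feat_dim)
```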
2.3. Temporal Feature Learning
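A minimal PyTorch sketch of steps (4)–(5): the per-slice CNN vectors form a sequence that a bidirectional LSTM reads in both time directions, and a softmax layer outputs the emotion class probabilities. The hidden size and class count are assumptions; in practice one would apply SpatialCNN to each slice, stack the vectors into (batch, n_slices, feat_dim), and feed that sequence to this module (training would typically use the pre-softmax logits with a cross-entropy loss).

```python
import torch
import torch.nn as nn

class TemporalBiLSTM(nn.Module):
    def __init__(self, feat_dim=128, hidden=64, n_classes=2):
        super().__init__()
        self.bilstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.cls = nn.Linear(2 * hidden, n_classes)   # forward + backward hidden states

    def forward(self, x):                    # x: (batch, n_slices, feat_dim)
        out, _ = self.bilstm(x)
        logits = self.cls(out[:, -1, :])     # last time step summarizes the sequence
        return torch.softmax(logits, dim=-1)
```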
3. Simulation and Result Analysis
3.1. Experimental Datasets
3.2. DE Feature Analysis
3.3. Result Analysis
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).