

AMDET: Attention Based Multiple Dimensions EEG Transformer for Emotion Recognition

Published: 22 September 2023

Abstract

Affective computing is an important subfield of artificial intelligence, and with the rapid development of brain-computer interface technology, emotion recognition based on EEG signals has received broad attention. Despite the large number of deep learning methods proposed, effectively exploiting the multi-dimensional information in EEG data remains a great challenge. In this article, we propose a deep learning model called the Attention-based Multiple Dimensions EEG Transformer (AMDET), which leverages the complementarity among the spectral, spatial, and temporal features of EEG data through a multi-dimensional global attention mechanism. We first transform the raw EEG data into a 3D temporal-spectral-spatial representation; AMDET then extracts effective features with spectral-spatial transformer blocks and focuses on critical time frames with a temporal attention block. We conduct extensive experiments on the DEAP, SEED, and SEED-IV datasets to evaluate the performance of AMDET, and it outperforms state-of-the-art baselines on all three datasets, achieving accuracy rates of 97.48%, 96.85%, 97.17%, and 87.32% on DEAP-Arousal, DEAP-Valence, SEED, and SEED-IV, respectively. Moreover, AMDET achieves over 90% accuracy with only eight channels, significantly improving its suitability for practical applications.
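The core idea above — applying attention separately along the spectral, spatial, and temporal axes of a 3D EEG representation — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the tensor shape (5 frequency bands × 8 channels × 10 time frames), the single-head attention without learned projections, and the function name are all illustrative assumptions.

```python
import numpy as np

def axis_attention(x, axis):
    """Scaled dot-product self-attention along one axis of a 3D
    (bands, channels, frames) EEG feature tensor.
    Illustrative sketch: single head, no learned Q/K/V projections."""
    x = np.moveaxis(x, axis, 0)             # tokens along the chosen axis
    tokens = x.reshape(x.shape[0], -1)      # flatten remaining dims as features
    d = tokens.shape[1]
    scores = tokens @ tokens.T / np.sqrt(d) # pairwise token similarities
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True) # softmax over tokens
    out = weights @ tokens                  # attention-weighted mixture
    return np.moveaxis(out.reshape(x.shape), 0, axis)

# Toy 3D representation: 5 frequency bands x 8 channels x 10 time frames
rng = np.random.default_rng(0)
eeg = rng.standard_normal((5, 8, 10))

spectral = axis_attention(eeg, axis=0)  # attend across frequency bands
spatial  = axis_attention(eeg, axis=1)  # attend across electrode channels
temporal = axis_attention(eeg, axis=2)  # attend across time frames
print(spectral.shape, spatial.shape, temporal.shape)
```

In AMDET itself the spectral and spatial attention are realized by transformer blocks with learned projections and multiple heads, and the temporal branch by a dedicated attention block; the sketch only shows how one tensor can be attended over along each of its three dimensions.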


Published In

IEEE Transactions on Affective Computing, Volume 15, Issue 3
July-Sept. 2024
1087 pages

Publisher

IEEE Computer Society Press

Washington, DC, United States


Qualifiers

  • Research-article

