research-article

TorchEEGEMO: A deep learning toolbox towards EEG-based emotion recognition

Published: 17 July 2024

Abstract

With the development of deep learning (DL), EEG-based emotion recognition has attracted increasing attention. Diverse DL algorithms have emerged to intelligently decode human emotion from EEG signals. However, the lack of a toolbox encapsulating these techniques further hampers the design, development, testing, implementation, and management of intelligent systems. To tackle this bottleneck, we propose a Python toolbox, TorchEEGEMO, which divides the workflow into five modules: datasets, transforms, model_selection, models, and trainers. Each module includes plug-and-play functions to construct and manage a stage of the workflow. Recognizing the frequent access to time windows of interest, we introduce a window-centric parallel input/output system, bolstering the efficiency of DL systems. Finally, we conduct extensive experiments to provide benchmark results for the supported modules. The results demonstrate the versatility and applicability of TorchEEGEMO across various scenarios.

Highlights

The first deep learning toolbox towards EEG-based emotion recognition.
A workflow that divides the recognition system into five plug-and-play modules.
Built-in functions cover datasets, transformations, models, algorithms, and more.
A novel window-centric EEG I/O system enhances system efficiency.
Experiments demonstrate benchmark performance across various scenarios.
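The window-centric access pattern highlighted above can be illustrated with a minimal, self-contained sketch. This is not the toolbox's actual API; the function name, window size, and toy data are illustrative assumptions showing only the core idea: a long recording is split into fixed-length time windows, each paired with the trial's emotion label, so that DL models can consume windows rather than whole recordings.

```python
def sliding_windows(signal, window_size, stride):
    """Yield successive fixed-length windows over a 1-D signal.

    A trailing partial window (shorter than window_size) is dropped,
    which is the common convention for fixed-shape model inputs.
    """
    for start in range(0, len(signal) - window_size + 1, stride):
        yield signal[start:start + window_size]


# A toy 10-sample "recording" with a single trial-level emotion label.
recording = list(range(10))
label = "positive"

# Non-overlapping 4-sample windows (stride == window size): each window
# inherits the trial's label, yielding (window, label) training pairs.
windows = [(w, label) for w in sliding_windows(recording, window_size=4, stride=4)]
print(windows)
```

With an overlapping stride (e.g. `stride=2`), the same recording yields more, partially redundant windows, a common form of data augmentation for EEG.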


Cited By

  • (2024) Are We in the Zone? Exploring the Features and Method of Detecting Simultaneous Flow Experiences Based on EEG Signals. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(4), 1–42. https://doi.org/10.1145/3699774. Online publication date: 21 November 2024.


Published In

cover image Expert Systems with Applications: An International Journal
Expert Systems with Applications: An International Journal  Volume 249, Issue PB
Sep 2024
1582 pages

Publisher

Pergamon Press, Inc.

United States


Author Tags

  1. Electroencephalography (EEG)
  2. Emotion recognition
  3. Deep learning toolbox

Qualifiers

  • Research-article

