
Bridge Graph Attention Based Graph Convolution Network With Multi-Scale Transformer for EEG Emotion Recognition

Published: 30 April 2024

Abstract

In multichannel electroencephalogram (EEG) emotion recognition, most graph-based studies employ shallow graph models to learn spatial characteristics, since deepening the network causes node over-smoothing. To address over-smoothing, we propose the bridge graph attention-based graph convolution network (BGAGCN). It bridges earlier graph convolution layers to the attention coefficients of the final layer, adaptively combining the output of each graph convolution layer in the manner of a graph attention network and thereby enhancing feature distinctiveness. Because graph-based networks focus primarily on local relationships among EEG channels, we introduce a transformer to capture global dependencies. Inspired by the neuroscience finding that neural activities at different timescales reflect distinct spatial connectivities, we modify the transformer into a multi-scale transformer (MT), which applies multi-head attention to multichannel EEG signals after 1D convolutions at different scales. The MT learns spatial features at a finer granularity, strengthening the model's representational ability. By combining BGAGCN and MT, our model, BGAGCN-MT, achieves state-of-the-art accuracy under both subject-dependent and subject-independent protocols on three benchmark EEG emotion datasets (SEED, SEED-IV, and DREAMER). Notably, our model effectively alleviates over-smoothing in graph neural networks and offers an efficient way to learn spatial relationships among EEG features at different scales.
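To make the bridging mechanism concrete, below is a minimal PyTorch sketch of the idea the abstract describes: outputs of earlier graph convolution layers are carried forward and adaptively combined through attention coefficients computed at the final layer. All names (SimpleGCNLayer, BridgeGAGCN) and design details (dense normalized adjacency, per-layer scoring via a shared linear map) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of "bridge" graph attention: earlier GCN outputs are kept and
# re-weighted by attention scores conditioned on the final layer's output.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGCNLayer(nn.Module):
    """Dense graph convolution: H' = ReLU(A_hat @ H @ W). Hypothetical helper."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x: torch.Tensor, a_hat: torch.Tensor) -> torch.Tensor:
        # x: (batch, nodes, in_dim); a_hat: normalized adjacency (nodes, nodes)
        return F.relu(a_hat @ self.lin(x))


class BridgeGAGCN(nn.Module):
    """Stack of GCN layers whose intermediate outputs are combined via
    attention coefficients derived at the final layer (the 'bridge')."""

    def __init__(self, in_dim: int, hid_dim: int, n_layers: int = 4):
        super().__init__()
        dims = [in_dim] + [hid_dim] * n_layers
        self.layers = nn.ModuleList(
            SimpleGCNLayer(dims[i], dims[i + 1]) for i in range(n_layers)
        )
        # One attention score per bridged layer output, GAT-style pairwise scoring.
        self.att = nn.Linear(2 * hid_dim, 1)

    def forward(self, x: torch.Tensor, a_hat: torch.Tensor) -> torch.Tensor:
        outs, h = [], x
        for layer in self.layers:
            h = layer(h, a_hat)
            outs.append(h)
        final = outs[-1]
        # Score each layer's output against the final output, then softmax
        # across layers so the combination weights sum to one.
        scores = torch.stack(
            [self.att(torch.cat([final, o], dim=-1)) for o in outs], dim=0
        )                                        # (n_layers, batch, nodes, 1)
        alpha = torch.softmax(scores, dim=0)
        # Keeping earlier, less-smoothed representations in the mix is what
        # counteracts over-smoothing as depth grows.
        return (alpha * torch.stack(outs, dim=0)).sum(dim=0)
```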
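The multi-scale transformer can likewise be sketched as parallel 1D convolutions with different kernel sizes, each followed by multi-head self-attention, with the per-scale features fused at the end. The kernel sizes, dimensions, time pooling, and concatenation-based fusion below are assumptions for illustration, not the paper's exact architecture.

```python
# Hedged sketch of a multi-scale transformer (MT): each kernel size gives one
# temporal "scale"; self-attention is applied within each scale before fusion.
import torch
import torch.nn as nn


class MultiScaleTransformer(nn.Module):
    def __init__(self, n_channels: int, d_model: int = 64,
                 kernel_sizes=(3, 5, 7), n_heads: int = 4):
        super().__init__()
        # One temporal 1D convolution per scale; padding k//2 preserves length.
        self.convs = nn.ModuleList(
            nn.Conv1d(n_channels, d_model, k, padding=k // 2)
            for k in kernel_sizes
        )
        self.attns = nn.ModuleList(
            nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            for _ in kernel_sizes
        )
        self.fuse = nn.Linear(d_model * len(kernel_sizes), d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, time)
        per_scale = []
        for conv, attn in zip(self.convs, self.attns):
            h = conv(x).transpose(1, 2)      # (batch, time, d_model)
            h, _ = attn(h, h, h)             # self-attention within one scale
            per_scale.append(h.mean(dim=1))  # pool over time: (batch, d_model)
        return self.fuse(torch.cat(per_scale, dim=-1))


# Usage example: 62-channel EEG (as in SEED), 200 time samples per segment.
feats = MultiScaleTransformer(n_channels=62)(torch.randn(8, 62, 200))
print(feats.shape)  # torch.Size([8, 64])
```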




        Published In

IEEE Transactions on Affective Computing, Volume 15, Issue 4 (Oct.-Dec. 2024), 375 pages

        Publisher

IEEE Computer Society Press, Washington, DC, United States

