Search Results (1)

Search Parameters:
Keywords = DE-CNN-BiLSTM

11 pages, 3691 KiB  
Article
A Novel DE-CNN-BiLSTM Multi-Fusion Model for EEG Emotion Recognition
by Fachang Cui, Ruqing Wang, Weiwei Ding, Yao Chen and Liya Huang
Mathematics 2022, 10(4), 582; https://doi.org/10.3390/math10040582 - 13 Feb 2022
Cited by 38 | Viewed by 4112
Abstract
As a long-standing research topic in the field of brain–computer interfaces, emotion recognition still suffers from low recognition accuracy. In this research, we present a novel model named DE-CNN-BiLSTM that deeply integrates the complexity of EEG signals, the spatial structure of the brain, and the temporal context of emotion formation. First, we extract the complexity properties of the EEG signal by computing Differential Entropy over different time slices in different frequency bands, obtaining 4D feature tensors organized by brain location. The 4D tensors are then fed into a Convolutional Neural Network to learn the brain's spatial structure and output time sequences; a Bidirectional Long Short-Term Memory network subsequently learns past and future information from these time sequences. Compared with existing emotion recognition models, the new model decodes the EEG signal more deeply and extracts key emotional features to improve accuracy. The simulation results show the algorithm achieves an average accuracy of 94% on the DEAP dataset and 94.82% on the SEED dataset, confirming its high accuracy and strong robustness.
(This article belongs to the Special Issue From Brain Science to Artificial Intelligence)
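The feature-extraction step described in the abstract (Differential Entropy per frequency band and time slice) can be sketched as follows. This is a minimal illustration, not the authors' code: the band edges, sampling rate, slice length, and the FFT-mask band isolation are assumptions, and the standard closed form DE = ½ ln(2πeσ²) for an approximately Gaussian signal is used.

```python
import numpy as np

def differential_entropy(var):
    # Closed-form DE of a Gaussian signal with variance `var`:
    # DE = 0.5 * ln(2 * pi * e * sigma^2)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def de_features(eeg, fs=128, bands=((4, 8), (8, 14), (14, 31), (31, 45)),
                slice_s=1.0):
    """Compute DE features per time slice, channel, and frequency band.

    eeg: array of shape (n_channels, n_samples).
    Returns an array of shape (n_slices, n_channels, n_bands); stacking
    such slices over brain-location maps yields the 4D tensors the model
    consumes. Band edges here are illustrative theta/alpha/beta/gamma ranges.
    """
    step = int(fs * slice_s)
    n_slices = eeg.shape[1] // step
    feats = np.zeros((n_slices, eeg.shape[0], len(bands)))
    freqs = np.fft.rfftfreq(step, 1.0 / fs)
    for t in range(n_slices):
        seg = eeg[:, t * step:(t + 1) * step]
        spec = np.fft.rfft(seg, axis=1)
        for b, (lo, hi) in enumerate(bands):
            # Crude band isolation: zero out spectral bins outside [lo, hi).
            mask = (freqs >= lo) & (freqs < hi)
            band_sig = np.fft.irfft(spec * mask, n=step, axis=1)
            feats[t, :, b] = differential_entropy(np.var(band_sig, axis=1))
    return feats
```

In practice an IIR band-pass filter (as is common in the EEG literature) would replace the FFT mask, but the DE formula and the (slice, channel, band) feature layout are the same.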
Show Figures

Figure 1: The framework of the proposed EEG-based emotion recognition model DE-CNN-BiLSTM.
Figure 2: The spatial mapping of the DE features in four frequency bands.
Figure 3: The spatial structure distribution of the CNN model.
Figure 4: The structure of the Bi-LSTM.
Figure 5: (a) Topology changes in the DE at 10–60 s slices in four frequency bands for positive emotion. (b) Topology changes in the DE at 10–60 s slices in four frequency bands for negative emotion.
Figure 6: Training progress of the model in terms of training and validation accuracy for the emotional dimension of Valence.
Figure 7: (a) Distribution of the emotion recognition accuracy of the DEAP dataset on Valence and Arousal. (b) Distribution of the emotion recognition accuracy of the SEED dataset.