Applying nonlinear measures to the brain rhythms: an effective method for epilepsy diagnosis

Abstract

Background

Epilepsy is a neurological disorder that affects almost 50 million people worldwide. These statistics indicate the importance of epilepsy diagnosis. Analysis of electroencephalogram (EEG) signals is one of the most common methods for characterizing epilepsy; hence, various strategies have been applied to classify epileptic EEGs.

Methods

In this paper, four nonlinear feature types were extracted from EEGs and their frequency sub-bands: fractal dimensions computed by the Higuchi method (HFD) and the Katz method (KFD), the Hurst exponent, and the Lempel–Ziv (L–Z) complexity measure. The features were then ranked with the ReliefF algorithm and applied sequentially to three different classifiers (MLPNN, linear SVM, and RBF SVM).

Results

According to the dataset used for this study, there are five classification problems, denoted ABCD/E, AB/CD/E, A/D/E, A/E, and D/E. In all cases, MLPNN was the most accurate classifier; its accuracies for these classification problems were 99.91%, 98.19%, 98.5%, 100%, and 99.84%, respectively.

Conclusion

The results demonstrate that KFD is the highest-ranking feature. In addition, the beta and theta sub-bands are the most important frequency bands because, for all cases, the top features were the KFDs extracted from the beta and theta sub-bands. Moreover, high levels of accuracy were obtained using just these two features, which reduces the complexity of the classification.

Introduction

The human brain is a complex system that displays temporally intricate dynamics. Electroencephalography is one way to observe the brain’s activity: EEG signals are recordings of the brain’s electrical activity and are used by clinicians in the diagnosis of neurological disorders [1]. Epilepsy is one of the most prevalent neurological diseases, and almost fifty million people worldwide suffer from it [2]. Seizures are caused by abnormal electrical discharges of groups of neurons in the brain; therefore, EEG signals contain valuable information about this disorder. Detecting abnormalities in EEGs is a critical step in the diagnostic process. Since visual inspection is not a reliable way to detect such abnormalities, various methods have been proposed to extract informative features. Different strategies have been used to analyze EEG data; the prevalent techniques are temporal and spectral analysis and nonlinear methods [3]. Altunay et al. applied the linear prediction error energy method to find seizures in EEGs and asserted that this approach can be used as an index for epileptic seizures in EEG signals [3, 4]. Ghosh-Dastidar et al. used principal component analysis (PCA) to classify epileptic EEGs and proposed a model that achieved high accuracy (99.3%) [3, 5]. Acharya et al. used PCA with different classifiers and obtained 99% classification accuracy with a Gaussian mixture model (GMM) classifier [3, 6]. Subasi and Gursoy implemented PCA, independent component analysis, and linear discriminant analysis for EEG classification [3, 7]. Lekshmi et al. utilized PCA with wavelet transforms for EEG signal classification [8]. Sharma et al. used a wavelet-statistical features method to detect non-convulsive seizures [9]. Ocak presented an approach based on the wavelet transform to classify epileptic seizures in EEG [3, 10]. A deep learning-based method was applied by Hussein et al. to detect seizures [11]. Raghu and Sriraam proposed a method based on neighborhood component analysis for the classification of focal seizures [12]. Mutlu employed Hilbert vibration decomposition for epilepsy diagnosis [13]. Yuan et al. used diffusion distance and Bayesian linear discriminant analysis for seizure prediction [14].

Among the wide spectrum of methods used for signal analysis, techniques based on nonlinear dynamics are of particular importance and carry prominent information about brain signals because of the nonlinearity and complexity of EEGs. Kannathal et al. claim that entropy estimators can differentiate between normal and abnormal EEG data with a reasonable level of accuracy [3, 15]. Chua et al. [16] and Acharya et al. [17] applied higher-order spectral (HOS) parameters for epilepsy detection [3]. Multi-fractal analysis has been implemented for seizure detection [18]. Li et al. employed fractal spectral analysis for epilepsy diagnosis [19]. Geng et al. extracted several nonlinear features (correlation dimension (CD), Hurst exponent (HE), and approximate entropy (ApEn)) from healthy and epileptic EEGs and reported that CD and HE are helpful in characterizing epileptic and interictal EEG [20]. Guler et al. presented an algorithm using Lyapunov exponents for classifying EEGs [21].

In this study, nonlinear measures, namely KFD, HFD, the Hurst exponent, and Lempel–Ziv complexity, were applied to the EEGs and their brain rhythms. The ReliefF algorithm was used to select the best features, and classification was performed with three different classifiers (MLPNN, linear SVM, and RBF SVM). The main goal of this study is to achieve high accuracy in epileptic EEG classification using only a few features. In addition, the most informative EEG rhythms and nonlinear features, as well as the best classifier for detecting epilepsy in EEGs, were determined.

Data set

The data set used for this study is publicly available online [22] and comprises five collections, denoted A–E; each collection consists of 100 single-channel EEG segments. The duration and sampling frequency of each segment are 23.6 s and 173.61 Hz, respectively. The segments of collections A and B were recorded from five healthy subjects using the 10–20 electrode system while they were awake and relaxed with eyes open (A) and eyes closed (B). Collections C and D were recorded from five epilepsy patients during seizure-free intervals: collection D from within the epileptogenic zone and collection C from the hippocampal formation of the opposite hemisphere. Collection E is composed of EEG segments with seizure activity. A typical example of each EEG set is depicted in Fig. 1. More information about the data set is available in [23].

Fig. 1 Typical example of the five collections (A, B, C, D, and E). The amplitude unit for all of them is µV

Methods

Classification problems

Considering these five collections, five different sub-problems were designed: two three-class and three binary classification problems. These problems are of practical significance and are frequently studied in the literature on epileptic EEG classification; a minimal sketch of how the five labeling schemes can be encoded is given after the list.

  1. ABCD/E
  2. AB/CD/E
  3. A/D/E
  4. A/E
  5. D/E
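
For concreteness, the label assignment for these five problems can be encoded as a simple mapping. The following minimal Python sketch is purely illustrative (the names `PROBLEMS`, `make_labels`, and `segments_by_set` are not from the paper, and the original work was implemented in MATLAB).

```python
# Map each Bonn collection (A-E) to a class label for the five
# classification problems studied in this paper. Names are illustrative.
PROBLEMS = {
    "ABCD/E": {"A": 0, "B": 0, "C": 0, "D": 0, "E": 1},
    "AB/CD/E": {"A": 0, "B": 0, "C": 1, "D": 1, "E": 2},
    "A/D/E":  {"A": 0, "D": 1, "E": 2},
    "A/E":    {"A": 0, "E": 1},
    "D/E":    {"D": 0, "E": 1},
}

def make_labels(problem, segments_by_set):
    """Collect segments and labels for one classification problem.

    segments_by_set: dict mapping 'A'..'E' to a list of EEG segments.
    Collections that do not take part in the problem are skipped.
    """
    mapping = PROBLEMS[problem]
    X, y = [], []
    for set_name, label in mapping.items():
        for seg in segments_by_set[set_name]:
            X.append(seg)
            y.append(label)
    return X, y
```

For example, `make_labels("A/E", segments_by_set)` would return the 200 segments of collections A and E with labels 0 and 1.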

Brain rhythms

In addition to the original signal, features were extracted from four frequency sub-bands; accordingly, four rhythms were obtained from each EEG: the delta rhythm (0.5–4 Hz), theta rhythm (4–8 Hz), alpha rhythm (8–14 Hz), and beta rhythm (14–30 Hz). Fourth-order Butterworth band-pass filters were used to extract the desired frequency sub-bands. Figure 2 illustrates these rhythms for an EEG data sample from collection A.

Fig. 2 Brain rhythms for EEG data from collection A (delta, theta, alpha, and beta waves). The amplitude unit for all of them is µV
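
As an illustration of this filtering step, the following sketch uses SciPy's Butterworth design with the dataset's 173.61 Hz sampling rate. The use of zero-phase `filtfilt` filtering and the function name `extract_rhythms` are assumptions, since the paper does not specify these details (the original analysis was done in MATLAB).

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 173.61  # sampling frequency of the Bonn dataset (Hz)

BANDS = {            # rhythm: (low cut, high cut) in Hz
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 14.0),
    "beta":  (14.0, 30.0),
}

def extract_rhythms(signal, fs=FS, order=4):
    """Band-pass the raw EEG into delta/theta/alpha/beta rhythms."""
    rhythms = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="bandpass")
        # filtfilt gives zero-phase filtering so the rhythms stay time-aligned
        rhythms[name] = filtfilt(b, a, signal)
    return rhythms
```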

Nonlinear features

Fractal dimension

Fractal dimension (FD) is a nonlinear measure used to analyze time series and biomedical signals. In general, a fractal is a geometric concept referring to a set of points that exhibits self-similarity. Fractal shapes are complex and have a non-integer dimension. FD can be calculated using time-domain or phase-space analysis. Since phase-space methods are slow and time-consuming [24], time-domain approaches were applied here. In this paper, the methods proposed by Higuchi and Katz are reviewed and applied to the EEG signals.

Higuchi fractal dimension (HFD)

Consider T as a temporal signal

$$T = T(1),\; T(2),\; T(3),\; \ldots,\; T(Y)$$

Then, p new temporal sub-series are defined as follows:

$$T_{p}^{f} : T(f),\; T(f + p),\; T(f + 2p),\; \ldots,\; T\!\left( f + \left[ \frac{Y - f}{p} \right] p \right), \qquad f = 1, 2, \ldots, p$$
(1)

where f denotes the starting point and p the time interval.

The average length Af(p) is computed for each of these sub-series, and their mean A(p) is computed for p = 1, 2, 3, …, ps, where ps is the saturation point. In the current paper, two different values of ps were considered.

$$A_{f}(p) = \frac{1}{p}\left\{ \left( \sum_{j = 1}^{\left[ \frac{Y - f}{p} \right]} \left| T(f + jp) - T(f + (j - 1)p) \right| \right) \frac{Y - 1}{\left[ \frac{Y - f}{p} \right] p} \right\}$$
(2)
$$A(p) = \frac{1}{p} \sum_{f = 1}^{p} A_{f}(p)$$
(3)

Since

$$A(p) \propto p^{ - r} ,$$
(4)

By plotting log(A(p)) versus log(1/p), the HFD is obtained as the gradient of the straight line fitted to the points [25].
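
A minimal NumPy sketch of this procedure (Eqs. 1–4) is given below; the function name `higuchi_fd` and its interface are illustrative, not the authors' code.

```python
import numpy as np

def higuchi_fd(T, p_max):
    """Higuchi fractal dimension of a 1-D signal T with saturation point p_max.

    Follows Eqs. (1)-(4): average curve lengths A(p) are computed for
    p = 1..p_max and the FD is the slope of log(A(p)) vs log(1/p).
    """
    T = np.asarray(T, dtype=float)
    Y = len(T)
    A = []
    for p in range(1, p_max + 1):
        lengths = []
        for f in range(1, p + 1):                  # starting points f = 1..p
            n = (Y - f) // p                       # number of steps [(Y-f)/p]
            if n < 1:
                continue
            idx = f - 1 + np.arange(n + 1) * p     # T(f), T(f+p), ... (zero-based)
            dist = np.abs(np.diff(T[idx])).sum()
            norm = (Y - 1) / (n * p)               # normalization factor
            lengths.append(dist * norm / p)        # A_f(p), Eq. (2)
        A.append(np.mean(lengths))                 # A(p), Eq. (3)
    p_vals = np.arange(1, p_max + 1)
    # slope of log(A(p)) against log(1/p) gives the HFD
    slope, _ = np.polyfit(np.log(1.0 / p_vals), np.log(A), 1)
    return slope
```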

Katz fractal dimension (KFD)

Katz proposed a normalized formula for calculating the fractal dimension. According to the Katz approach, a curve’s fractal dimension is obtained by:

$$K = \frac{{\log_{10} (U)}}{{\log_{10} (b)}}$$
(5)

U is the total length of the curve, obtained by summing the distances between consecutive points:

$$U = \sum_{j} \mathrm{dist}(j, j + 1)$$
(6)

where

$$\mathrm{dist}(j, j + 1) = \sqrt{(x_{j + 1} - x_{j})^{2} + (y_{j + 1} - y_{j})^{2}}$$
(7)

and b is the distance between the first point and the point of the curve farthest from it:

$$b = \max_{j} \left( \sqrt{(x_{1} - x_{j})^{2} + (y_{1} - y_{j})^{2}} \right)$$
(8)

Since the calculation of K depends on the measurement units, U and b are normalized by m, the mean distance between successive points. Finally, K becomes [26]:

$$K = \frac{\log_{10}(U/m)}{\log_{10}(b/m)} \overset{n = U/m}{=} \frac{\log_{10}(n)}{\log_{10}\left( \frac{b}{U} \right) + \log_{10}(n)}$$
(9)
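
The Katz estimate can be computed directly from the waveform. The following sketch, which treats the signal as the planar curve (j, T(j)), implements Eqs. (5)–(9); the function name `katz_fd` is illustrative.

```python
import numpy as np

def katz_fd(T):
    """Katz fractal dimension of a 1-D signal T (Eqs. 5-9)."""
    T = np.asarray(T, dtype=float)
    x = np.arange(len(T))
    dx = np.diff(x)
    dy = np.diff(T)
    dist = np.sqrt(dx ** 2 + dy ** 2)          # dist(j, j+1), Eq. (7)
    U = dist.sum()                             # total curve length, Eq. (6)
    m = dist.mean()                            # mean distance between points
    b = np.sqrt((x - x[0]) ** 2 + (T - T[0]) ** 2).max()   # Eq. (8)
    n = U / m                                  # number of steps
    return np.log10(n) / (np.log10(b / U) + np.log10(n))   # Eq. (9)
```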

Hurst exponent

The Hurst exponent estimates the self-similarity of a temporal signal; its values lie in the range [0, 1] and express the trendiness of the series. H > 0.5 indicates positive correlation, H < 0.5 indicates negative correlation, and H = 0.5 corresponds to an uncorrelated time series [27, 28]. This feature is estimated by the rescaled range (R/S) technique:

$$E\!\left[ \frac{R(m)}{S(m)} \right] = C m^{H}, \qquad m \to \infty$$
(10)

A temporal signal of total length M is divided into subsequences of length m = M, M/2, M/4, M/8, …. E denotes the expected value, R(m) and S(m) are the range and the standard deviation of a subsequence, respectively, and C is a constant. The following equations describe the rescaling:

$$a = \frac{1}{m}\sum\limits_{j = 1}^{m} {T_{j} }$$
(11)
$$W_{k} = T_{k} - a, \qquad k = 1, 2, \ldots, m$$
(12)
$$V_{k} = \sum_{j = 1}^{k} W_{j}, \qquad k = 1, 2, \ldots, m$$
(13)
$$\frac{R}{S}(m) = \frac{R(m)}{S(m)} = \frac{\max(V_{1}, V_{2}, \ldots, V_{m}) - \min(V_{1}, V_{2}, \ldots, V_{m})}{\sqrt{\frac{1}{m} \sum_{j = 1}^{m} (T_{j} - a)^{2}}}$$
(14)

After calculating R/S for all subsequences of a given length m, the values are averaged over the subsequences (R/Save(m)). Different values of m yield different R/Save values. The Hurst exponent is then estimated as the gradient of the linear regression line whose X and Y coordinates are log(m) and log(R/Save(m)), respectively [29,30,31].
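
A compact sketch of this rescaled-range estimate is shown below; the choice of non-overlapping blocks and the minimum block length `min_len` are assumptions, since the paper does not state these details.

```python
import numpy as np

def hurst_rs(T, min_len=8):
    """Hurst exponent of a 1-D signal T via rescaled-range (R/S) analysis.

    The series is split into non-overlapping blocks of length
    m = M, M/2, M/4, ... (Eqs. 10-14); R/S is averaged over the blocks
    and H is the slope of log(R/S_ave(m)) against log(m).
    """
    T = np.asarray(T, dtype=float)
    M = len(T)
    sizes, rs_ave = [], []
    m = M
    while m >= min_len:
        rs_vals = []
        for start in range(0, M - m + 1, m):        # non-overlapping blocks
            block = T[start:start + m]
            a = block.mean()                        # Eq. (11)
            W = block - a                           # Eq. (12)
            V = np.cumsum(W)                        # Eq. (13)
            R = V.max() - V.min()
            S = np.sqrt(np.mean(W ** 2))            # standard deviation
            if S > 0:
                rs_vals.append(R / S)               # Eq. (14)
        if rs_vals:
            sizes.append(m)
            rs_ave.append(np.mean(rs_vals))
        m //= 2
    H, _ = np.polyfit(np.log(sizes), np.log(rs_ave), 1)
    return H
```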

Lempel–Ziv (L–Z) complexity

The L–Z complexity measure is based on a coarse-graining approach. To estimate the L–Z complexity, the temporal signal must first be converted into a symbolic sequence; a binary sequence is a common choice. This conversion is performed by thresholding the signal values. Two different thresholds (L), the median and the mean of the signal, were used in the analysis; therefore, two different complexity values were obtained. S is the binary sequence of the series:

S = u(1),u(2), …,u(m).

where u(j) is defined as:

$$u(j) = \begin{cases} 0, & \text{if } T(j) < L \\ 1, & \text{if } T(j) > L \end{cases}$$
(15)

The obtained binary sequence is scanned from left to right for each threshold. The complexity counter c(m) counts the number of distinct substrings contained in the sequence; each time a new substring is detected, \(c(m) \to c(m) + 1\).

The normalized complexity (C(m)) is defined as below [32, 33]:

$$C(m) = \frac{c(m)}{{m/\log_{2} (m)}}$$
(16)
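
A short sketch combining the coarse-graining of Eq. (15) with the substring counting and the normalization of Eq. (16) is given below. The parsing rule follows the standard Lempel–Ziv (1976) scheme, and ties (T(j) = L) are assigned to 1, a detail the paper leaves open; the function name `lz_complexity` is illustrative.

```python
import numpy as np

def lz_complexity(T, threshold="median"):
    """Normalized Lempel-Ziv complexity of signal T (Eqs. 15-16)."""
    T = np.asarray(T, dtype=float)
    L = np.median(T) if threshold == "median" else np.mean(T)
    s = "".join("1" if t >= L else "0" for t in T)    # coarse-graining, Eq. (15)

    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the candidate phrase while it already occurs in the prefix
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1                     # new substring found: c(m) -> c(m) + 1
        i += k
    return c / (n / np.log2(n))    # normalization, Eq. (16)
```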

Supervised feature selection

In this part, six nonlinear features, namely two HFDs (with different saturation points), the KFD, two L–Z complexity measures (with the mean and the median as thresholds), and the Hurst exponent, were extracted from the EEGs as well as from their rhythms. Thus, the feature set has 30 members (6 nonlinear features × 5 signals: the original EEG and its four rhythms). Feature selection is a classification step of significant importance. In order to select the best subset of features, the ReliefF algorithm was used. This technique assigns a weight to each feature according to its relevance and ranks the features by these weights. More information about this algorithm is available in [34, 35].
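
The following is a plain re-implementation sketch of ReliefF weighting for numeric features; it is not the code used in the paper, and the sampling and neighbor settings shown are assumptions. Features would then be ranked by decreasing weight, e.g. `ranking = np.argsort(relieff(X, y))[::-1]`.

```python
import numpy as np

def relieff(X, y, n_neighbors=10, n_samples=None, random_state=0):
    """Minimal ReliefF feature weighting (numeric features, multi-class).

    Returns one relevance weight per feature; larger means more relevant.
    """
    rng = np.random.default_rng(random_state)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0                        # avoid division by zero
    classes, counts = np.unique(y, return_counts=True)
    priors = dict(zip(classes, counts / n))

    idx = rng.choice(n, size=n if n_samples is None else n_samples, replace=False)
    W = np.zeros(d)
    for i in idx:
        diffs = np.abs(X - X[i]) / span          # normalized per-feature distances
        dist = diffs.sum(axis=1)
        dist[i] = np.inf                         # exclude the instance itself
        for c in classes:
            mask = (y == c)
            order = np.argsort(np.where(mask, dist, np.inf))[:n_neighbors]
            contrib = diffs[order].mean(axis=0)
            if c == y[i]:
                W -= contrib / len(idx)          # nearest hits pull weights down
            else:
                scale = priors[c] / (1.0 - priors[y[i]])
                W += scale * contrib / len(idx)  # nearest misses push weights up
    return W
```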

Classification

Three classifiers were used for each problem: MLPNN, linear SVM, and RBF SVM. Parameter tuning plays a major role in classifier accuracy, so classification was performed with nested ten-fold cross-validation: an inner cross-validation loop for parameter tuning and an outer loop for model evaluation.
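
A sketch of this nested scheme using scikit-learn (rather than the MATLAB and LIBSVM tools actually used in the paper) could look as follows; the parameter grid shown is only an example.

```python
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def nested_cv_accuracy(X, y, param_grid, seed=0):
    """Nested 10-fold CV: the inner loop tunes hyper-parameters,
    the outer loop estimates the accuracy of the tuned model."""
    inner = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
    outer = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed + 1)
    model = make_pipeline(StandardScaler(), SVC())
    search = GridSearchCV(model, param_grid, cv=inner, scoring="accuracy")
    scores = cross_val_score(search, X, y, cv=outer, scoring="accuracy")
    return scores.mean()

# Example grid for an RBF SVM; C and gamma are tuned in the inner loop
grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.1, 1]}
# acc = nested_cv_accuracy(X_selected, y, grid)   # X_selected: top-ranked features
```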

Multi-layer perceptron neural networks (MLPNN)

An MLPNN is composed of multiple layers of computational nodes in a directed graph, with each layer fully connected to the next. Every node except the input nodes is a neuron with an activation function [36, 37]. The MATLAB (R2016a) Neural Network Toolbox was used for the MLPNN classification. The number of neurons in the hidden layer was set by ten-fold cross-validation, and the transfer function used in the network architecture was the hyperbolic tangent sigmoid.
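
As a rough stand-in for the MATLAB network, a scikit-learn `MLPClassifier` with a hyperbolic tangent activation and a hidden-layer size tuned by ten-fold cross-validation could be set up as follows; the candidate layer sizes and training settings are assumptions, since the paper does not report them.

```python
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import GridSearchCV, StratifiedKFold

# One hidden layer with tanh activation; the hidden-layer size is chosen
# by 10-fold cross-validation, as in the paper.
mlp = MLPClassifier(activation="tanh", max_iter=2000, random_state=0)
grid = GridSearchCV(
    mlp,
    {"hidden_layer_sizes": [(5,), (10,), (15,), (20,)]},
    cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0),
    scoring="accuracy",
)
# grid.fit(X_train, y_train); y_pred = grid.predict(X_test)
```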

Support vector machine (SVM)

Vapnik introduced the SVM based on the principle of structural risk minimization, and it is known as one of the most robust of the well-established classification algorithms. SVMs are frequently used for regression, classification, and nonlinear function approximation. The SVM is fundamentally a binary classifier, but several methods extend it to multiclass problems [38]. We used the LIBSVM (version 3.20) toolbox to perform the linear SVM and RBF SVM classifications [39, 40]. For more details on the SVM algorithms, please refer to [38, 39].
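
For reference, scikit-learn's `SVC` wraps LIBSVM, so an equivalent setup can be sketched as below; the specific C and gamma values are placeholders that would be tuned in the inner cross-validation loop described above.

```python
from sklearn.svm import SVC

# scikit-learn's SVC is built on LIBSVM; multiclass problems are handled
# internally with a one-vs-one scheme.
linear_svm = SVC(kernel="linear", C=1.0)
rbf_svm = SVC(kernel="rbf", C=10.0, gamma="scale")
# linear_svm.fit(X_train, y_train); acc = linear_svm.score(X_test, y_test)
```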

Results and discussion

KFD is the best feature, considering both its rank and how often it appears among the selected features. The five top-ranked features for each classification problem are shown in Table 1.

Table 1 Five top-ranked features for each classification problem

Table 2 compares the performance of the three classifiers using the selected features and using all features (without feature selection). MLPNN has the best performance in all cases. In addition, the optimal number of features for the best classification accuracy is reported.

Table 2 Performance for each classifier in all classification problems with and without feature selection

The highest performance of the three classifiers in each case is presented in Fig. 3. In Fig. 4, the three classifiers are compared for case 1 (ABCD/E). Although MLPNN had the best overall performance, its accuracy was considerably lower than that of the other classifiers when only the first selected feature was used. However, when the second top feature was added to the first, the performance rose significantly, and MLPNN eventually reached 99.91% classification accuracy.

Fig. 3 The highest performance of the different classifiers (MLPNN, RBF SVM, and Linear SVM) for all classification problems. The MLPNN classifier has the best classification accuracy in all cases

Fig. 4 Performance versus number of features for case 1 (ABCD/E). Ranked features were applied sequentially to the three classifiers (MLPNN, Linear SVM, and RBF SVM). The best accuracy was obtained by the MLPNN classifier

Performances for the MLPNN classifier are depicted in Fig. 5 for all cases. As Fig. 5 illustrates, case 4 (A/E) had the best results because it achieved the highest accuracy (100%) with the fewest features (optimal point: five features, 100%). In the other cases, decent performance can still be obtained using only a few features.

Fig. 5 Performance versus number of features for all cases with the MLPNN classifier

Our results indicate that nonlinear features can solve the different classification problems with high accuracy and that KFD is a paramount measure for classifying EEGs. This shows the strength of the fractal dimension computed with the Katz algorithm. Furthermore, analysis of the brain rhythms demonstrates that the theta and beta frequency bands are the most informative sub-bands. By combining KFD with these two frequency bands, in other words, using the KFDs extracted from the theta and beta rhythms, a decent level of accuracy is achieved with just two features. In this study, different classifiers, including MLPNN, linear SVM, and RBF SVM, were used in order to compare their performance. The results reveal that the MLPNN classifier has the best classification accuracy in all of the problems. For comparison, Table 3 lists some recent studies conducted on the same data set and the same problems, together with their results.

Table 3 Accuracy of relevant studies that used the Bonn EEG dataset; the five classification problems are listed for comparison

As Table 3 illustrates, the binary classification problems yielded better performance than the three-class problems. The approach presented in this article outperforms the other studies. Furthermore, our approach is relatively homogeneous: the accuracy did not vary significantly between the two-class and the three-class problems.

Conclusion

There is substantial information not only in the seizure intervals of patients’ EEGs but also in the seizure-free periods. This disorder cannot be diagnosed properly by visual inspection or simple measures alone; therefore, various methods have been used to classify normal and abnormal EEGs. In this study, five different classification problems were designed to classify epileptic EEGs. These binary and three-class classification problems have been used frequently in many relevant papers. We used nonlinear measures, including HFD, KFD, the Hurst exponent, and the L–Z complexity measure, which were applied to the original EEGs and their frequency sub-bands; the extracted features were then ranked by the ReliefF algorithm. The KFDs extracted from the beta and theta sub-bands were the most informative features in all cases. The ranked features were applied sequentially to three different classifiers (MLPNN, linear SVM, and RBF SVM). Afterward, the optimum point, which had the highest accuracy with the fewest features, was identified for each problem and classifier. This study demonstrates that MLPNN performed better than the SVM classifiers and that feature selection improved the classification accuracy. The most significant advantage of this work is that high performance for all problems (binary and three-class) could be obtained using just two features; hence, we think the proposed approach can be effectively implemented for epilepsy diagnosis thanks to its high accuracy and the low complexity resulting from the small number of features.

Availability of data and materials

The data set used for this study is publicly available online [22].

References

  1. Sanei S, Chambers JA. EEG signal processing. Wiley; 2013.

  2. WHO Epilepsy. Available via World Health Organization. http://www.who.int/mediacentre/factsheets/fs999/en/

  3. Acharya UR, Sree SV, Swapna G, Martis RJ, Suri JS. Automated EEG analysis of epilepsy: a review. Knowl-Based Syst. 2013;45:147–65.

  4. Altunay S, Telatar Z, Erogul O. Epileptic EEG detection using the linear prediction error energy. Expert Syst Appl. 2010;37(8):5661–5.

  5. Ghosh-Dastidar S, Adeli H, Dadmehr N. Principal component analysis-enhanced cosine radial basis function neural network for robust epilepsy and seizure detection. IEEE Trans Biomed Eng. 2008;55(2):512–8.

  6. Acharya UR, Sree SV, Alvin APC, Suri JS. Use of principal component analysis for automatic classification of epileptic EEG activities in wavelet framework. Expert Syst Appl. 2012;39(10):9072–8.

  7. Subasi A, Gursoy MI. EEG signal classification using PCA, ICA, LDA and support vector machines. Expert Syst Appl. 2010;37(12):8659–66.

  8. Lekshmi S, Selvam V, Rajasekaran MP. EEG signal classification using Principal Component Analysis and Wavelet Transform with Neural Network. In: 2014 International conference on communications and signal processing (ICCSP), 2014. IEEE, pp 687–690

  9. Sharma P, Khan YU, Farooq O, Tripathi M, Adeli H. A wavelet-statistical features approach for nonconvulsive seizure detection. Clin EEG Neurosci. 2014;45(4):274–84.

  10. Ocak H. Optimal classification of epileptic seizures in EEG using wavelet analysis and genetic algorithm. Signal Process. 2008;88(7):1858–67.

  11. Hussein R, Palangi H, Ward R, Wang ZJ. Epileptic seizure detection: a deep learning approach. arXiv preprint arXiv:1803.09848; 2018.

  12. Raghu S, Sriraam N. Classification of focal and non-focal EEG signals using neighborhood component analysis and machine learning algorithms. Expert Syst Appl. 2018;113:18–32.

  13. Mutlu AY. Detection of epileptic dysfunctions in EEG signals using Hilbert vibration decomposition. Biomed Signal Process Control. 2018;40:33–40.

  14. Yuan S, Zhou W, Chen L. Epileptic seizure prediction using diffusion distance and bayesian linear discriminate analysis on intracranial EEG. Int J Neural Syst. 2018;28(01):1750043.

  15. Kannathal N, Choo ML, Acharya UR, Sadasivan P. Entropies for detection of epilepsy in EEG. Comput Methods Prog Biomed. 2005;80(3):187–94.

  16. Chua KC, Chandran V, Acharya UR, Lim CM. Application of higher order spectra to identify epileptic EEG. J Med Syst. 2011;35(6):1563–71.

  17. Acharya UR, Sree SV, Suri JS. Automatic detection of epileptic EEG signals using higher order cumulant features. Int J Neural Syst. 2011;21(05):403–14.

  18. Zhang Y, Zhou W, Yuan S. Multifractal analysis and relevance vector machine-based automatic seizure detection in intracranial EEG. Int J Neural Syst. 2015;25(06):1550020.

  19. Li X, Polygiannakis J, Kapiris P, Peratzakis A, Eftaxias K, Yao X. Fractal spectral analysis of pre-epileptic seizures in terms of criticality. J Neural Eng. 2005;2(2):11.

  20. Geng S, Zhou W, Yuan Q, Cai D, Zeng Y. EEG non-linear feature extraction using correlation dimension and Hurst exponent. Neurol Res. 2011;33(9):908–12.

  21. Güler NF, Übeyli ED, Güler I. Recurrent neural networks employing Lyapunov exponents for EEG signals classification. Expert Syst Appl. 2005;29(3):506–14.

  22. Bonn University EEG Data Set. http://epileptologie-bonn.de/cms/front_content.php?idcat=193&lang=3&changelang=3.

  23. Andrzejak RG, Lehnertz K, Mormann F, Rieke C, David P, Elger CE. Indications of nonlinear deterministic and finite-dimensional structures in time series of brain electrical activity: dependence on recording region and brain state. Phys Rev E. 2001;64(6):061907.

  24. Accardo A, Affinito M, Carrozzi M, Bouquet F. Use of the fractal dimension for the analysis of electroencephalographic time series. Biol Cybern. 1997;77(5):339–50.

  25. Higuchi T. Approach to an irregular time series on the basis of the fractal theory. Physica D. 1988;31(2):277–83.

  26. Katz MJ. Fractals and the analysis of waveforms. Comput Biol Med. 1988;18(3):145–56.

  27. Racine R. Estimating the Hurst exponent. Zurich: Mosaic Group; 2011.

  28. Valdiviezo-N JC, Castro R, Cristóbal G, Carbone A. Hurst exponent for fractal characterization of LANDSAT images. In: SPIE Optical Engineering+ Applications, 2014. International Society for Optics and Photonics, pp 922103–922103–922109

  29. Kaplan I (2003) Estimating the Hurst exponent.

  30. Acharya R, Faust O, Kannathal N, Chua T, Laxminarayan S. Non-linear analysis of EEG signals at various sleep stages. Comput Methods Programs Biomed. 2005;80(1):37–45.

  31. Balli T, Palaniappan R. Classification of biological signals using linear and nonlinear features. Physiol Meas. 2010;31(7):903.

  32. Aboy M, Hornero R, Abásolo D, Álvarez D. Interpretation of the Lempel–Ziv complexity measure in the context of biomedical signal analysis. IEEE Trans Biomed Eng. 2006;53(11):2282–8.

  33. Lempel A, Ziv J. On the complexity of finite sequences. IEEE Trans Inf Theory. 1976;22(1):75–81.

  34. Kononenko I, Šimec E, Robnik-Šikonja M. Overcoming the myopia of inductive learning algorithms with RELIEFF. Appl Intell. 1997;7(1):39–55.

  35. Robnik-Šikonja M, Kononenko I. Theoretical and empirical analysis of ReliefF and RReliefF. Mach Learn. 2003;53(1–2):23–69.

  36. Subasi A, Ercelebi E. Classification of EEG signals using neural network and logistic regression. Comput Methods Programs Biomed. 2005;78(2):87–99.

  37. Kerdegari H, Samsudin K, Ramli AR, Mokaram S Evaluation of fall detection classification approaches. In: 2012 4th International Conference on Intelligent and Advanced Systems (ICIAS), 2012. IEEE, pp 131–136

  38. Law M (2006) A simple introduction to support vector machines. Lecture for CSE 802

  39. Chang C-C, Lin C-J. LIBSVM: a library for support vector machines. ACM Trans Intell Syst Technol (TIST). 2011;2(3):27.

  40. LIBSVM. https://www.csie.ntu.edu.tw/~cjlin/libsvm/

  41. Orhan U, Hekim M, Ozer M. EEG signals classification using the K-means clustering and a multilayer perceptron neural network model. Expert Syst Appl. 2011;38(10):13475–81.

  42. Acharya UR, Molinari F, Sree SV, Chattopadhyay S, Ng K-H, Suri JS. Automated diagnosis of epileptic EEG using entropies. Biomed Signal Process Control. 2012;7(4):401–8.

  43. Murugavel AM, Ramakrishnan S. Hierarchical multi-class SVM with ELM kernel for epileptic EEG signal classification. Med Biol Eng Compu. 2016;54(1):149–61.

  44. Acharya UR, Sree SV, Chattopadhyay S, Yu W, Ang PCA. Application of recurrence quantification analysis for the automated identification of epileptic EEG signals. Int J Neural Syst. 2011;21(03):199–211.

  45. Wang Y, Zhou W, Yuan Q, Li X, Meng Q, Zhao X, Wang J. Comparison of ictal and interictal EEG signals using fractal features. Int J Neural Syst. 2013;23(06):1350028.

  46. Guo L, Rivero D, Dorado J, Rabunal JR, Pazos A. Automatic epileptic seizure detection in EEGs based on line length feature and artificial neural networks. J Neurosci Methods. 2010;191(1):101–9.

  47. Kumar Y, Dewal M, Anand R. Epileptic seizures detection in EEG using DWT-based ApEn and artificial neural network. SIViP. 2014;8(7):1323–34.

  48. Nicolaou N, Georgiou J. Detection of epileptic electroencephalogram based on permutation entropy and support vector machines. Expert Syst Appl. 2012;39(1):202–9.

  49. Kaya Y, Uyar M, Tekin R, Yıldırım S. 1D-local binary pattern based feature extraction for classification of epileptic EEG signals. Appl Math Comput. 2014;243:209–19.

Acknowledgements

Not applicable.

Funding

No funding was obtained for this study.

Author information

Contributions

AT and MRD contributed to the design of the research, interpretation of the results and to the writing of the manuscript. AT contributed to the analysis of the data and implementation of the research. MRD supervised the work. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Mohammad Reza Daliri.

Ethics declarations

Ethics approval and consent to participate

This article does not contain any studies with human participants or animals performed by any of the authors.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Torabi, A., Daliri, M.R. Applying nonlinear measures to the brain rhythms: an effective method for epilepsy diagnosis. BMC Med Inform Decis Mak 21, 270 (2021). https://doi.org/10.1186/s12911-021-01631-6

