Classification of Individual Finger Movements from Right Hand Using fNIRS Signals
Figure 1. (A) Experimental setup; (B) optode arrangement; (C) overcap to reduce external light; (D) optode holder.
Figure 2. Experimental paradigm visualization. A single experiment consists of three sessions of finger-tapping trials for each finger. A single trial consists of a 10 s finger-tapping task followed by a 10 s rest period.
Figure 3. (A) Source-detector placement over the motor cortex. Colour code: red (sources), blue (detectors), green (channels); black numbers indicate channel numbers. (B) Total haemoglobin changes over the motor cortex during index-finger tapping.
Figure 4. Comparison of the classifiers on the basis of performance parameters (accuracy, precision, recall, and F1-score).
Figure 5. Confusion matrices of all classifiers for subject one (S01). Classes are labeled ‘0’, ‘1’, ‘2’, ‘3’, ‘4’, and ‘5’, which stand for ‘Rest’, ‘Thumb’, ‘Index’, ‘Middle’, ‘Ring’, and ‘Little’ finger-tapping classes, respectively. (a) Quadratic discriminant analysis (QDA); (b) AdaBoost; (c) support vector machine (SVM); (d) decision tree (DT); (e) artificial neural network (ANN); (f) k-nearest neighbours (kNN); (g) random forest (RF); (h) extreme gradient boosting (XGBoost).
Figure 6. Oxygenated haemoglobin signal for a complete experimental trial.
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Instrumentation
2.3. Experimental Setup and Instructions
2.4. Experimental Design
2.5. Brain Area and Montage Selection
2.6. Signal Preprocessing
2.7. Modified Beer–Lambert Law (MBLL)
2.8. Signal Filtration
2.9. Feature Extraction
2.10. Classification
2.11. Performance Evaluation
3. Results and Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
fNIRS | Functional Near-Infrared Spectroscopy
SVM | Support Vector Machine
RF | Random Forest
DT | Decision Tree
QDA | Quadratic Discriminant Analysis
ANN | Artificial Neural Networks
kNN | k-Nearest Neighbors
Wavelength (nm) | DPF | ε_HbO (1/cm)/(moles/L) | ε_HbR (1/cm)/(moles/L)
---|---|---|---
760 | 7.25 | 1466.5865 | 3843.707
850 | 6.38 | 2526.391 | 1798.643
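For reference, below is a minimal sketch of the modified Beer–Lambert law conversion using the DPF values and extinction coefficients tabulated above. The 3 cm source-detector separation, the variable names, and the use of NumPy are illustrative assumptions rather than the authors' implementation; the assignment of the two extinction-coefficient columns to HbO and HbR follows their standard absorption spectra.

```python
import numpy as np

# Extinction coefficients (1/cm per mole/L) and DPF from the table above.
# Assigning the columns to HbO/HbR follows standard absorption spectra (assumption).
EXT = {760: {"HbO": 1466.5865, "HbR": 3843.707},
       850: {"HbO": 2526.391, "HbR": 1798.643}}
DPF = {760: 7.25, 850: 6.38}
DISTANCE_CM = 3.0  # assumed source-detector separation


def mbll(i_760, i_850, baseline_760, baseline_850, d=DISTANCE_CM):
    """Convert raw intensities at 760/850 nm to delta HbO / delta HbR (moles/L)."""
    # Optical density changes relative to the baseline intensity.
    dod_760 = -np.log10(i_760 / baseline_760)
    dod_850 = -np.log10(i_850 / baseline_850)
    # Effective path length = separation * DPF, per wavelength.
    A = np.array([[EXT[760]["HbO"] * DPF[760], EXT[760]["HbR"] * DPF[760]],
                  [EXT[850]["HbO"] * DPF[850], EXT[850]["HbR"] * DPF[850]]]) * d
    # Solve dOD = A @ [dHbO, dHbR] for every sample.
    dod = np.vstack([dod_760, dod_850])   # shape (2, n_samples)
    dhb = np.linalg.solve(A, dod)
    return dhb[0], dhb[1]                  # delta HbO, delta HbR
```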
Sr. No. | Statistical Feature | Mathematical Formulation/Description
---|---|---
1. | Signal mean | $\mu_X = \frac{1}{N}\sum_{k=L}^{U} X_k$, where $\mu_X$ is the mean of window $w$, $N$ is the number of samples in the window, $L$ and $U$ are the lower and upper limits of the window, and $X$ stands for $\Delta$HbO or $\Delta$HbR.
2. | Signal peak (signal maximum) | The maximum value within the window.
3. | Signal minimum | The minimum value within the window.
4. | Signal skewness | $\mathrm{skew}(X) = E\left[\left(\frac{X-\mu}{\sigma}\right)^{3}\right]$, where $E$ is the expectation, $\mu$ is the mean, and $\sigma$ is the standard deviation of the haemoglobin signal.
5. | Signal kurtosis | $\mathrm{kurt}(X) = E\left[\left(\frac{X-\mu}{\sigma}\right)^{4}\right]$, with $E$, $\mu$, and $\sigma$ as above.
6. | Signal variance | A measure of signal spread: $\sigma^{2} = \frac{1}{N}\sum_{k=L}^{U}\left(X_k - \mu_X\right)^{2}$.
7. | Signal median | The value separating the higher half from the lower half of values in the time window.
8. | Peak-to-peak | The difference between the maximum and the minimum value in the time window.
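The eight window-based features listed above can be computed with standard NumPy/SciPy routines. The sketch below is illustrative only: the function name, the single-channel array layout, and the window indexing convention are assumptions, not the authors' code.

```python
import numpy as np
from scipy.stats import skew, kurtosis


def window_features(x, lower, upper):
    """Statistical features of one haemoglobin signal window.

    x            : 1-D array of delta HbO (or delta HbR) samples for one channel
    lower, upper : sample indices delimiting the window (assumed convention)
    """
    w = x[lower:upper]
    return {
        "mean":         np.mean(w),                 # (1/N) * sum of samples
        "peak":         np.max(w),
        "minimum":      np.min(w),
        "skewness":     skew(w),                    # E[((X - mu)/sigma)^3]
        "kurtosis":     kurtosis(w, fisher=False),  # E[((X - mu)/sigma)^4]
        "variance":     np.var(w),
        "median":       np.median(w),
        "peak_to_peak": np.ptp(w),                  # max - min
    }
```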
Classifiers | Parameter Settings
---|---
QDA | priors = None, reg_param = 0.0
AdaBoost | n_estimators = 10, random_state = 0, learning_rate = 1.0
SVM | kernel = 'rbf', degree = 3, random_state = None
ANN | hidden_layer_sizes = (5, 2), solver = 'lbfgs', random_state = 1, max_iter = 300
Decision Tree | criterion = 'entropy', random_state = 0
kNN | n_neighbors = 5
Random Forest | n_estimators = 10, criterion = 'entropy', random_state = 0
XGBoost | booster = 'gbtree', verbosity = 1, nthread = maximum number of threads
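Translated into scikit-learn and XGBoost constructor calls, the parameter settings above might look like the following sketch. Only the parameters listed in the table are taken from the source; the remaining defaults, the dictionary layout, and the use of n_jobs in place of the older nthread argument in recent XGBoost versions are assumptions.

```python
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from xgboost import XGBClassifier

# One instance per classifier, configured with the table's hyperparameters.
classifiers = {
    "QDA":      QuadraticDiscriminantAnalysis(priors=None, reg_param=0.0),
    "AdaBoost": AdaBoostClassifier(n_estimators=10, random_state=0, learning_rate=1.0),
    "SVM":      SVC(kernel="rbf", degree=3, random_state=None),
    "ANN":      MLPClassifier(hidden_layer_sizes=(5, 2), solver="lbfgs",
                              random_state=1, max_iter=300),
    "DT":       DecisionTreeClassifier(criterion="entropy", random_state=0),
    "kNN":      KNeighborsClassifier(n_neighbors=5),
    "RF":       RandomForestClassifier(n_estimators=10, criterion="entropy",
                                       random_state=0),
    "XGBoost":  XGBClassifier(booster="gbtree", verbosity=1, n_jobs=-1),
}
```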
Classifier | Metric | S01 | S02 | S03 | S04 | S05 | S06 | S07 | S08 | S09 | S10 | S11 | S12 | S13 | S14 | S15 | S16 | S17 | S18 | S19 | S20 | S21 | S22 | S23 | S24 | Mean | STD
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
SVM | Accuracy | 0.58 | 0.57 | 0.58 | 0.57 | 0.56 | 0.57 | 0.58 | 0.59 | 0.58 | 0.57 | 0.57 | 0.62 | 0.57 | 0.57 | 0.64 | 0.57 | 0.59 | 0.58 | 0.65 | 0.60 | 0.57 | 0.58 | 0.60 | 0.59 | 0.59 | 0.02 |
Precision | 0.65 | 0.32 | 0.34 | 0.48 | 0.32 | 0.49 | 0.34 | 0.62 | 0.53 | 0.39 | 0.41 | 0.46 | 0.48 | 0.41 | 0.67 | 0.40 | 0.35 | 0.43 | 0.49 | 0.44 | 0.45 | 0.50 | 0.52 | 0.65 | 0.46 | 0.10 | |
Recall | 0.58 | 0.57 | 0.58 | 0.57 | 0.56 | 0.57 | 0.58 | 0.59 | 0.58 | 0.57 | 0.57 | 0.62 | 0.57 | 0.57 | 0.64 | 0.57 | 0.59 | 0.58 | 0.65 | 0.60 | 0.57 | 0.58 | 0.60 | 0.59 | 0.59 | 0.02 | |
F1 Score | 0.47 | 0.41 | 0.43 | 0.42 | 0.41 | 0.42 | 0.43 | 0.45 | 0.45 | 0.43 | 0.41 | 0.49 | 0.43 | 0.42 | 0.55 | 0.42 | 0.44 | 0.43 | 0.52 | 0.45 | 0.42 | 0.44 | 0.50 | 0.46 | 0.45 | 0.04 | |
RF | Accuracy | 0.84 | 0.65 | 0.84 | 0.70 | 0.73 | 0.77 | 0.75 | 0.75 | 0.75 | 0.76 | 0.73 | 0.73 | 0.72 | 0.80 | 0.78 | 0.72 | 0.71 | 0.75 | 0.82 | 0.68 | 0.77 | 0.77 | 0.78 | 0.78 | 0.75 | 0.05 |
Precision | 0.84 | 0.63 | 0.85 | 0.70 | 0.73 | 0.77 | 0.75 | 0.75 | 0.75 | 0.77 | 0.73 | 0.73 | 0.72 | 0.80 | 0.80 | 0.73 | 0.70 | 0.75 | 0.82 | 0.67 | 0.77 | 0.78 | 0.78 | 0.78 | 0.75 | 0.05 | |
Recall | 0.84 | 0.65 | 0.84 | 0.70 | 0.73 | 0.77 | 0.75 | 0.75 | 0.75 | 0.76 | 0.73 | 0.73 | 0.72 | 0.80 | 0.78 | 0.72 | 0.71 | 0.75 | 0.82 | 0.68 | 0.77 | 0.77 | 0.78 | 0.78 | 0.75 | 0.05 | |
F1 Score | 0.83 | 0.61 | 0.83 | 0.67 | 0.72 | 0.75 | 0.73 | 0.74 | 0.74 | 0.75 | 0.70 | 0.72 | 0.70 | 0.78 | 0.77 | 0.70 | 0.69 | 0.73 | 0.81 | 0.65 | 0.75 | 0.76 | 0.77 | 0.77 | 0.74 | 0.05 | |
DT | Accuracy | 0.79 | 0.56 | 0.76 | 0.28 | 0.67 | 0.68 | 0.23 | 0.68 | 0.71 | 0.70 | 0.63 | 0.71 | 0.65 | 0.73 | 0.76 | 0.71 | 0.67 | 0.69 | 0.76 | 0.64 | 0.72 | 0.71 | 0.75 | 0.71 | 0.66 | 0.13 |
Precision | 0.79 | 0.56 | 0.76 | 0.49 | 0.67 | 0.69 | 0.53 | 0.68 | 0.71 | 0.70 | 0.63 | 0.72 | 0.65 | 0.74 | 0.76 | 0.71 | 0.68 | 0.69 | 0.78 | 0.64 | 0.72 | 0.72 | 0.75 | 0.71 | 0.69 | 0.07 | |
Recall | 0.79 | 0.56 | 0.76 | 0.28 | 0.67 | 0.68 | 0.23 | 0.68 | 0.71 | 0.70 | 0.63 | 0.71 | 0.65 | 0.73 | 0.76 | 0.71 | 0.67 | 0.69 | 0.76 | 0.64 | 0.72 | 0.71 | 0.75 | 0.71 | 0.66 | 0.13 | |
F1 Score | 0.79 | 0.56 | 0.75 | 0.32 | 0.67 | 0.69 | 0.27 | 0.68 | 0.71 | 0.70 | 0.63 | 0.71 | 0.65 | 0.74 | 0.76 | 0.71 | 0.67 | 0.69 | 0.77 | 0.64 | 0.72 | 0.71 | 0.75 | 0.71 | 0.67 | 0.13 | |
AdaBoost | Accuracy | 0.41 | 0.55 | 0.46 | 0.56 | 0.52 | 0.52 | 0.39 | 0.50 | 0.48 | 0.52 | 0.51 | 0.42 | 0.43 | 0.51 | 0.43 | 0.38 | 0.49 | 0.45 | 0.49 | 0.53 | 0.45 | 0.38 | 0.34 | 0.46 | 0.47 | 0.06 |
Precision | 0.40 | 0.41 | 0.46 | 0.33 | 0.46 | 0.43 | 0.39 | 0.44 | 0.39 | 0.39 | 0.44 | 0.46 | 0.38 | 0.44 | 0.46 | 0.38 | 0.44 | 0.44 | 0.53 | 0.41 | 0.41 | 0.41 | 0.47 | 0.45 | 0.43 | 0.04 | |
Recall | 0.41 | 0.55 | 0.46 | 0.56 | 0.52 | 0.52 | 0.39 | 0.50 | 0.48 | 0.52 | 0.51 | 0.42 | 0.43 | 0.51 | 0.43 | 0.38 | 0.49 | 0.45 | 0.49 | 0.53 | 0.45 | 0.38 | 0.34 | 0.46 | 0.47 | 0.06 | |
F1 Score | 0.40 | 0.44 | 0.46 | 0.42 | 0.46 | 0.45 | 0.38 | 0.46 | 0.43 | 0.43 | 0.45 | 0.43 | 0.40 | 0.46 | 0.43 | 0.37 | 0.45 | 0.44 | 0.50 | 0.45 | 0.42 | 0.39 | 0.38 | 0.43 | 0.43 | 0.03 | |
QDA | Accuracy | 0.28 | 0.22 | 0.31 | 0.28 | 0.24 | 0.42 | 0.23 | 0.41 | 0.20 | 0.21 | 0.24 | 0.28 | 0.32 | 0.25 | 0.58 | 0.32 | 0.31 | 0.30 | 0.36 | 0.26 | 0.34 | 0.24 | 0.56 | 0.28 | 0.31 | 0.10 |
Precision | 0.59 | 0.49 | 0.66 | 0.49 | 0.56 | 0.48 | 0.53 | 0.50 | 0.55 | 0.52 | 0.45 | 0.59 | 0.61 | 0.51 | 0.59 | 0.49 | 0.54 | 0.56 | 0.64 | 0.54 | 0.49 | 0.54 | 0.69 | 0.47 | 0.54 | 0.06 | |
Recall | 0.28 | 0.22 | 0.31 | 0.28 | 0.24 | 0.42 | 0.23 | 0.41 | 0.20 | 0.21 | 0.24 | 0.28 | 0.32 | 0.25 | 0.58 | 0.32 | 0.31 | 0.30 | 0.36 | 0.26 | 0.34 | 0.24 | 0.56 | 0.28 | 0.31 | 0.10 | |
F1 Score | 0.29 | 0.25 | 0.33 | 0.32 | 0.24 | 0.43 | 0.27 | 0.42 | 0.16 | 0.22 | 0.26 | 0.30 | 0.33 | 0.28 | 0.57 | 0.35 | 0.33 | 0.30 | 0.42 | 0.30 | 0.38 | 0.23 | 0.58 | 0.31 | 0.33 | 0.10 | |
ANN | Accuracy | 0.61 | 0.58 | 0.60 | 0.57 | 0.58 | 0.58 | 0.58 | 0.60 | 0.60 | 0.58 | 0.58 | 0.63 | 0.57 | 0.58 | 0.64 | 0.59 | 0.60 | 0.61 | 0.67 | 0.61 | 0.59 | 0.59 | 0.62 | 0.59 | 0.60 | 0.02 |
Precision | 0.69 | 0.42 | 0.56 | 0.54 | 0.48 | 0.54 | 0.34 | 0.69 | 0.67 | 0.62 | 0.61 | 0.54 | 0.60 | 0.52 | 0.60 | 0.52 | 0.52 | 0.60 | 0.64 | 0.57 | 0.56 | 0.62 | 0.58 | 0.58 | 0.57 | 0.08 | |
Recall | 0.61 | 0.58 | 0.60 | 0.57 | 0.58 | 0.58 | 0.58 | 0.60 | 0.60 | 0.58 | 0.58 | 0.63 | 0.57 | 0.58 | 0.64 | 0.59 | 0.60 | 0.61 | 0.67 | 0.61 | 0.59 | 0.59 | 0.62 | 0.59 | 0.60 | 0.02 | |
F1 Score | 0.52 | 0.43 | 0.48 | 0.45 | 0.44 | 0.44 | 0.43 | 0.48 | 0.49 | 0.44 | 0.45 | 0.53 | 0.46 | 0.44 | 0.54 | 0.46 | 0.46 | 0.50 | 0.59 | 0.48 | 0.48 | 0.48 | 0.55 | 0.48 | 0.48 | 0.04 | |
kNN | Accuracy | 0.80 | 0.65 | 0.78 | 0.71 | 0.69 | 0.77 | 0.74 | 0.74 | 0.74 | 0.73 | 0.72 | 0.74 | 0.72 | 0.78 | 0.78 | 0.70 | 0.73 | 0.76 | 0.82 | 0.68 | 0.77 | 0.76 | 0.79 | 0.77 | 0.75 | 0.04 |
Precision | 0.80 | 0.63 | 0.78 | 0.69 | 0.68 | 0.76 | 0.74 | 0.74 | 0.73 | 0.72 | 0.72 | 0.73 | 0.71 | 0.78 | 0.78 | 0.70 | 0.72 | 0.77 | 0.81 | 0.66 | 0.76 | 0.76 | 0.79 | 0.77 | 0.74 | 0.05 | |
Recall | 0.80 | 0.65 | 0.78 | 0.71 | 0.69 | 0.77 | 0.74 | 0.74 | 0.74 | 0.73 | 0.72 | 0.74 | 0.72 | 0.78 | 0.78 | 0.70 | 0.73 | 0.76 | 0.82 | 0.68 | 0.77 | 0.76 | 0.79 | 0.77 | 0.75 | 0.04 | |
F1 Score | 0.79 | 0.62 | 0.78 | 0.69 | 0.68 | 0.76 | 0.73 | 0.73 | 0.73 | 0.72 | 0.70 | 0.73 | 0.70 | 0.78 | 0.77 | 0.69 | 0.71 | 0.75 | 0.82 | 0.66 | 0.76 | 0.76 | 0.79 | 0.77 | 0.73 | 0.05 | |
XGBoost | Accuracy | 0.86 | 0.64 | 0.86 | 0.71 | 0.74 | 0.78 | 0.74 | 0.77 | 0.78 | 0.79 | 0.71 | 0.76 | 0.73 | 0.82 | 0.80 | 0.75 | 0.75 | 0.77 | 0.86 | 0.68 | 0.81 | 0.78 | 0.84 | 0.79 | 0.77 | 0.06 |
Precision | 0.87 | 0.62 | 0.86 | 0.72 | 0.74 | 0.79 | 0.74 | 0.78 | 0.79 | 0.79 | 0.72 | 0.76 | 0.73 | 0.83 | 0.80 | 0.76 | 0.75 | 0.77 | 0.85 | 0.66 | 0.82 | 0.79 | 0.84 | 0.79 | 0.77 | 0.06 | |
Recall | 0.86 | 0.64 | 0.86 | 0.71 | 0.74 | 0.78 | 0.74 | 0.77 | 0.78 | 0.79 | 0.71 | 0.76 | 0.73 | 0.82 | 0.80 | 0.75 | 0.75 | 0.77 | 0.86 | 0.68 | 0.81 | 0.78 | 0.84 | 0.79 | 0.77 | 0.06 | |
F1 Score | 0.86 | 0.58 | 0.85 | 0.69 | 0.72 | 0.77 | 0.72 | 0.76 | 0.76 | 0.77 | 0.69 | 0.75 | 0.72 | 0.81 | 0.78 | 0.73 | 0.73 | 0.75 | 0.85 | 0.64 | 0.80 | 0.77 | 0.83 | 0.78 | 0.75 | 0.07 |
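The per-subject performance parameters reported above (accuracy, precision, recall, and F1-score) can be obtained from the true and predicted labels with standard scikit-learn metrics. The sketch below assumes weighted averaging over the six classes; this is an assumption, though it is consistent with the recall values matching the accuracy values in the table.

```python
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, confusion_matrix)


def evaluate(y_true, y_pred):
    """Performance parameters for one subject's six-class predictions."""
    return {
        "accuracy":  accuracy_score(y_true, y_pred),
        # Weighted averaging over the six classes is an assumed choice.
        "precision": precision_score(y_true, y_pred, average="weighted",
                                     zero_division=0),
        "recall":    recall_score(y_true, y_pred, average="weighted"),
        "f1_score":  f1_score(y_true, y_pred, average="weighted"),
        "confusion": confusion_matrix(y_true, y_pred),
    }
```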
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Khan, H.; Noori, F.M.; Yazidi, A.; Uddin, M.Z.; Khan, M.N.A.; Mirtaheri, P. Classification of Individual Finger Movements from Right Hand Using fNIRS Signals. Sensors 2021, 21, 7943. https://doi.org/10.3390/s21237943