Locomotion Mode Recognition Algorithm Based on Gaussian Mixture Model Using IMU Sensors
Figure 1. IMU sensors mounted on the lower extremities: four MTw Awinda IMUs, two attached to the thighs and two to the feet. The sensors communicate with the computer via the Awinda USB dongle.
Figure 2. A participant walked over-ground across the various terrains at gait speeds corresponding to BPM 90/110/130. The terrains were, in sequence, level walking, stair ascent, stair descent, ramp ascent, and ramp descent (left to right).
Figure 3. Overview diagram of the data analysis. The process comprises data acquisition, pre-processing, the machine-learning classifier, and the locomotion mode output. Pre-processing comprises hip angle estimation, feature selection and extraction, labeling, and feature scaling.
Figure 4. Estimated hip angle and foot pitch angle data for the five terrains (LW/SA/SD/RA/RD) used for principal component analysis.
Figure 5. Classification strategy for the Gaussian mixture model (GMM) algorithm. The strategy comprises two layers. (a1) The first layer classifies the stair terrains (1st recognizer); in this layer, LW/RA/RD are merged into a single class. (a2) Representative figure of the first layer; the classes are LW, RA, RD: 1 / SA: 2 / SD: 3. (b1) The second layer classifies the level walking and ramp terrains (2nd recognizer). (b2) Representative figure of the second layer; the classes are LW: 1 / RA: 2 / RD: 3.
Figure 6. Principal component (PC) data in three-dimensional space for beats per minute (BPM) 90/110/130. Terrains are color-coded as in the legend. (a) Full-dependent model: subjects are shown with different markers. (b) Individual-dependent model: each subject's data are arranged by row.
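The Figure 5 caption above describes a two-layer GMM strategy (first recognizer: {LW, RA, RD} vs. SA vs. SD; second recognizer: LW vs. RA vs. RD) applied to PCA-reduced hip and foot pitch angle features (Figures 4 and 6). The following is a minimal sketch of that strategy, not the authors' implementation: it assumes scikit-learn's GaussianMixture and PCA, three principal components (as suggested by the 3-D PC plots in Figure 6), and two mixture components per class; `GMMClassifier`, `fit_two_layer`, `X_train`, and `y_train` are illustrative names.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture


class GMMClassifier:
    """Fit one GMM per class; predict the class with the highest log-likelihood."""

    def __init__(self, n_components=2):  # components per class is an assumption
        self.n_components = n_components
        self.models = {}

    def fit(self, X, y):
        for label in np.unique(y):
            gmm = GaussianMixture(n_components=self.n_components,
                                  covariance_type="full", random_state=0)
            gmm.fit(X[y == label])
            self.models[label] = gmm
        return self

    def predict(self, X):
        labels = list(self.models)
        # Log-likelihood of every sample under every class-conditional GMM.
        scores = np.column_stack([self.models[c].score_samples(X) for c in labels])
        return np.asarray(labels)[scores.argmax(axis=1)]


def fit_two_layer(X_train, y_train, n_pc=3):
    """Layer 1: {LW, RA, RD} vs. SA vs. SD.  Layer 2: LW vs. RA vs. RD."""
    pca = PCA(n_components=n_pc).fit(X_train)
    Z = pca.transform(X_train)
    # Terrain labels as in the terrain-label table: LW=1, SA=2, SD=3, RA=4, RD=5.
    y1 = np.where(np.isin(y_train, [1, 4, 5]), 1, y_train)  # merge LW/RA/RD into one class
    layer1 = GMMClassifier().fit(Z, y1)
    mask = np.isin(y_train, [1, 4, 5])
    layer2 = GMMClassifier().fit(Z[mask], y_train[mask])    # separate LW / RA / RD
    return pca, layer1, layer2
```

At inference, the first recognizer is evaluated on the PC features; only samples it assigns to the merged class are passed to the second recognizer, mirroring the two-layer structure of Figure 5.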
Abstract
1. Introduction
2. Methods
2.1. Sensor Systems
2.2. Experimental Protocol
- Participants walked over-ground in time with the BPM rhythms.
- The data correspond to walking without obstacles.
3. Locomotion Mode Recognition (LMR) Algorithm
3.1. Pre-Processing
3.2. Machine-Learning Classifier
3.2.1. Gaussian Mixture Model (GMM)
3.2.2. Expectation-Maximization (EM) Algorithm
3.3. Locomotion Mode Classifier
3.4. Classification Strategy for GMM Algorithm
3.5. Performance Evaluation
4. Results
4.1. Results of Data Analysis
4.2. Results for the Confusion Matrix According to the BPM
5. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
ADL | Activities of Daily Living |
LMR | Locomotion Mode Recognition |
GMM | Gaussian Mixture Model |
BPM | Beats per Minute |
PC | Principal Component |
LOOCV | Leave-One-Out Cross-Validation |
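LOOCV in the list above refers to leave-one-out cross-validation. The snippet below is a minimal sketch of a subject-wise leave-one-out split, assuming scikit-learn's LeaveOneGroupOut and that the held-out unit is a participant; the paper's exact protocol (Section 3.5) may differ, and `loocv_accuracy`, `clf_factory`, and `subjects` are illustrative names.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut


def loocv_accuracy(clf_factory, X, y, subjects):
    """Train on all subjects but one, test on the held-out subject, and repeat."""
    accs = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
        clf = clf_factory().fit(X[train_idx], y[train_idx])
        accs.append(np.mean(clf.predict(X[test_idx]) == y[test_idx]))
    return float(np.mean(accs)), float(np.std(accs))
```

For example, `loocv_accuracy(lambda: GMMClassifier(), Z, y, subjects)` would report the mean and standard deviation of the held-out-subject accuracies for the per-class GMM sketch shown earlier.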
References
- Life Expectancy. Available online: https://www.oecd-ilibrary.org/sites/40e1c86c-en/index.html?itemId=/content/component/40e1c86c-en (accessed on 4 January 2021).
- Carmeli, E.; Imam, B.; Merrick, J. The relationship of pre-sarcopenia (low muscle mass) and sarcopenia (loss of muscle strength) with functional decline in individuals with intellectual disability (ID). Arch. Gerontol. Geriatr. 2012, 55, 181–185.
- Seven Activities of Daily Living. Available online: https://www.sevenshomecare.com/services/7-activities-of-daily-living/ (accessed on 4 January 2021).
- Al-dabbagh, A.H.; Ronsse, R. A review of terrain detection systems for applications in locomotion assistance. Robot. Auton. Syst. 2020, 103628.
- Jang, J.; Kim, K.; Lee, J.; Lim, B.; Cho, J.K.; Shim, Y. Preliminary study of online gait recognizer for lower limb exoskeletons. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 5818–5824.
- Gong, C.; Xu, D.; Zhou, Z.; Vitiello, N.; Wang, Q. BPNN-Based Real-Time Recognition of Locomotion Modes for an Active Pelvis Orthosis with Different Assistive Strategies. Int. J. Humanoid Robot. 2020, 17, 2050004.
- Chen, B.; Zheng, E.; Wang, Q. A locomotion intent prediction system based on multi-sensor fusion. Sensors 2014, 14, 12349–12369.
- Shahmoradi, S.; Shouraki, S.B. A fuzzy sequential locomotion mode recognition system for lower limb prosthesis control. In Proceedings of the 2017 Iranian Conference on Electrical Engineering (ICEE), Tehran, Iran, 2–4 May 2017; pp. 2153–2158.
- Figueiredo, J.; Carvalho, S.P.; Gonçalves, D.; Moreno, J.C.; Santos, C.P. Daily Locomotion Recognition and Prediction: A Kinematic Data-Based Machine Learning Approach. IEEE Access 2020, 8, 33250–33262.
- Sherratt, F.; Plummer, A.; Iravani, P. Understanding LSTM Network Behaviour of IMU-Based Locomotion Mode Recognition for Applications in Prostheses and Wearables. Sensors 2021, 21, 1264.
- Gao, F.; Liu, G.; Liang, F.; Liao, W.H. IMU-Based locomotion mode identification for transtibial prostheses, orthoses, and exoskeletons. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1334–1343.
- Han, Y.; Liu, C.; Yan, L.; Ren, L. Design of Decision Tree Structure with Improved BPNN Nodes for High-Accuracy Locomotion Mode Recognition Using a Single IMU. Sensors 2021, 21, 526.
- Rubio-Solis, A.; Panoutsos, G.; Beltran-Perez, C.; Martinez-Hernandez, U. A multilayer interval type-2 fuzzy extreme learning machine for the recognition of walking activities and gait events using wearable sensors. Neurocomputing 2020, 389, 42–55.
- Tiwari, A.; Joshi, D. An infrared sensor-based instrumented shoe for gait events detection on different terrains and transitions. IEEE Sens. J. 2020, 20, 10779–10791.
- Digo, E.; Agostini, V.; Pastorelli, S.; Gastaldi, L.; Panero, E. Gait Phases Detection in Elderly using Trunk-MIMU System. In Proceedings of the 14th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2021), Vienna, Austria, 11–13 February 2021; pp. 58–65.
- Shin, D.B.; Lee, S.C.; Hwang, S.H.; Baek, I.H.; No, J.K.; Hwang, S.W.; Han, C.S. Development of the Algorithm of Locomotion Modes Decision based on RBF-SVM for Hip Gait Assist Robot. J. Korean Soc. Precis. Eng. 2020, 37, 187–194.
- Wang, W.F.; Lien, W.C.; Liu, C.Y.; Yang, C.Y. Study on tripping risks in fast walking through cadence-controlled gait analysis. J. Healthc. Eng. 2018, 2018, 2723178.
- Ducharme, S.W.; Sands, C.J.; Moore, C.C.; Aguiar, E.J.; Hamill, J.; Tudor-Locke, C. Changes to gait speed and the walk ratio with rhythmic auditory cuing. Gait Posture 2018, 66, 255–259.
- Schimpl, M.; Moore, C.; Lederer, C.; Neuhaus, A.; Sambrook, J.; Danesh, J.; Ouwehand, W.; Daumer, M. Association between walking speed and age in healthy, free-living individuals using mobile accelerometry—A cross-sectional study. PLoS ONE 2011, 6, e23299.
- Wheelchair Ramp Information. Available online: https://www.brainline.org/article/wheelchair-ramp-information (accessed on 4 January 2021).
- Alonge, F.; Cucco, E.; D’Ippolito, F.; Pulizzotto, A. The use of accelerometers and gyroscopes to estimate hip and knee angles on gait analysis. Sensors 2014, 14, 8430–8446.
- Watanabe, T.; Saito, H.; Koike, E.; Nitta, K. A preliminary test of measurement of joint angles and stride length with wireless inertial sensors for wearable gait evaluation system. Comput. Intell. Neurosci. 2011, 2011, 975193.
- Watanabe, T.; Teruyama, Y.; Ohashi, K. Comparison of angle measurements between integral-based and quaternion-based methods using inertial sensors for gait evaluation. In Proceedings of the International Joint Conference on Biomedical Engineering Systems and Technologies, Angers, France, 3–6 March 2014; pp. 274–288.
- Takeda, R.; Lisco, G.; Fujisawa, T.; Gastaldi, L.; Tohyama, H.; Tadano, S. Drift removal for improving the accuracy of gait parameters using wearable sensor systems. Sensors 2014, 14, 23230–23247.
- Jeong, D.H.; Ziemkiewicz, C.; Fisher, B.; Ribarsky, W.; Chang, R. iPCA: An interactive system for PCA-based visual analytics. In Computer Graphics Forum; Wiley Online Library: Hoboken, NJ, USA, 2009; Volume 28, pp. 767–774.
- Constantinopoulos, C.; Titsias, M.K.; Likas, A. Bayesian feature and model selection for Gaussian mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 1013–1018.
- Brunton, S.L.; Kutz, J.N. Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control; Cambridge University Press: Cambridge, UK, 2019.
- Bishop, C.M. Pattern Recognition and Machine Learning; Springer: Berlin/Heidelberg, Germany, 2006.
- Difference between Machine Learning and Artificial Intelligence. Available online: https://www.geeksforgeeks.org/difference-between-machine-learning-and-artificial-intelligence/ (accessed on 4 January 2021).
- Janecek, A.; Gansterer, W.; Demel, M.; Ecker, G. On the relationship between feature selection and classification accuracy. In New Challenges for Feature Selection in Data Mining and Knowledge Discovery, Proceedings of Machine Learning Research; PMLR: Graz, Austria, 2008; pp. 90–105.
- Moore, A.W. Cross-Validation for Detecting and Preventing Overfitting; School of Computer Science, Carnegie Mellon University: Pittsburgh, PA, USA, 2001.
- Standard Maximum Ramp Slope. Available online: https://inspectapedia.com/Stairs/Access_Ramp_Slope.php (accessed on 6 April 2021).
No. | 1 | 2 | 3 | 4 |
---|---|---|---|---|
Sex | M | M | M | F |
Age | 32 | 31 | 23 | 33 |
Height [cm] | 161 | 173 | 176 | 165 |
Weight [kg] | 60 | 73 | 71 | 62 |
Terrain | Label |
---|---|
Level Walking (LW) | 1 |
Stair Ascent (SA) | 2 |
Stair Descent (SD) | 3 |
Ramp Ascent (RA) | 4 |
Ramp Descent (RD) | 5 |
No. | Recognizer | BPM 90 (1.03 m/s) | BPM 110 (1.34 m/s) | BPM 130 (1.57 m/s) |
---|---|---|---|---|
Full-dependent model | | | | |
All | 1st | 98.75% | 99.33% | 98.39% |
 | 2nd | 95.78% | 95.75% | 87.54% |
 | Consumption time (ms) | 14.5 | 21.1 | 14 |
Individual-dependent model | | | | |
Subject 1 | 1st | 100% | 100% | 100% |
 | 2nd | 96.5% | 96.5% | 95.71% |
Subject 2 | 1st | 99.37% | 99.71% | 99.34% |
 | 2nd | 97.32% | 96.45% | 93.75% |
Subject 3 | 1st | 98.81% | 98.61% | 93.51% |
 | 2nd | 94.3% | 94.58% | 98.01% |
Subject 4 | 1st | 100% | 100% | 100% |
 | 2nd | 95.65% | 96.59% | 96.59% |
Total | 1st | 99.55 ± 0.5% | 99.58 ± 0.57% | 98.21 ± 2.73% |
 | 2nd | 95.94 ± 1.11% | 96.03 ± 0.84% | 96.02 ± 1.54% |
 | Consumption time (ms) | 8 ± 6.68 | 7 ± 8.66 | 2.4 ± 1.14 |
Reference | Year | Sensor | Placement | No. of Activities | No. of Subjects | Inclination Angle (Ramp Site) | Classifier | Accuracy | Computation Time (ms) |
---|---|---|---|---|---|---|---|---|---|
[Sensors] | | | | | | | | | |
Proposed method | 2021 | 4 IMUs | 2 thigh, 2 foot | 5 | 4 healthy | | GMM | 99.33% (SA/SD) 95.75% (LW/RA/RD) | 21.1 |
[7] | 2014 | 2 IMUs, 2 FSR | 1 thigh, 1 shank, 2 foot | 5 | 7 healthy | | LDA | 99.71 ± 0.05% | - |
[8] | 2017 | 3 IMUs, 1 FSR | 1 thigh, 1 shank, 2 foot | 7 | 4 healthy | | Fuzzy sequential pattern recognition / HMM | 95.8% / 86.5% | - |
[9] | 2020 | 7 IMUs | 1 torso, 2 thigh, 2 shank, 2 foot | 5 | 10 healthy | | Gaussian SVM | 99.8 ± 0.3% | - |
[11] | 2020 | 1 IMU | 1 heel | 5 | 3 healthy, 3 amputee | | Elliptical boundary | 98.5% | - |
[12] | 2021 | 1 IMU | 1 knee joint | 6 | 6 healthy | | IBPNN-DTS | 97.29% | - |
[10] | 2021 | 5 IMUs | 1 chest, 2 hip joints, 2 ankle joints | 5 | 22 healthy | - | LSTM | Above 95% | - |
[Hip exoskeleton robot + Sensors] | | | | | | | | | |
[5] | 2017 | 3 IMUs, 2 encoders | 1 torso, 2 hip joints, 2 ankle joints | 5 | 5 healthy | Above | RBF-SVM | 99.3% (LW/SA/SD) 95.45% (RA/RD) | - |
[6] | 2020 | 2 IMUs, 2 encoders | 2 thigh, 2 hip joints | 6 | 3 healthy | | BPNN | 98.43% (zero-torque) 98.03% (assistive mode) | 0.9 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Shin, D.; Lee, S.; Hwang, S. Locomotion Mode Recognition Algorithm Based on Gaussian Mixture Model Using IMU Sensors. Sensors 2021, 21, 2785. https://doi.org/10.3390/s21082785