Generisch-Net: A Generic Deep Model for Analyzing Human Motion with Wearable Sensors in the Internet of Health Things
Figure 1. Distribution of classes for the different datasets.
Figure 2. The complete pipeline of the project, including the neural network. IMUs of smartphones and smartwatches are used for data collection. Long signals are decomposed through segmentation and passed to Generisch-Net, which consists of bidirectional GRU layers followed by three inception-like modules.
Figure 3. Precision (P), recall (R) and F1-score (F1) for the WISDM 2019 HAR dataset.
Figure 4. Confusion matrix for WISDM 2019 for the best case of Generisch-Net.
Figure 5. Precision (P), recall (R) and F1-score (F1) for the WISDM 2011 HAR dataset.
Figure 6. Confusion matrix for WISDM 2011 for the best case of Generisch-Net.
Figure 7. Precision (P), recall (R) and F1-score (F1) for the Re-ID dataset for the best case of Generisch-Net.
Figure 8. Confusion matrices: the first row shows the confusion matrix for the closed-access Re-ID dataset for the best case of Generisch-Net; the second row zooms into it for Classes 22–29 (a) and Classes 32–38 (b).
Figure 9. Precision (P), recall (R) and F1-score (F1) for the closed-access emotions dataset in the best case of Generisch-Net.
Figure 10. Confusion matrix for the closed-access emotions dataset for the best case of Generisch-Net.
Abstract
1. Introduction
- We introduce Generisch-Net, a novel generic bidirectional GRU–convolutional neural network (BiGRU-CNN) designed to analyze human motion using wearable IMUs, such as those found in smartwatches and smartphones. The model has been trained for human activity recognition (HAR), human emotion recognition (HER), and person re-identification (Re-ID) tasks (see Section 4).
- The proposed model has been validated across three datasets, achieving average accuracies of 96.97% for HAR, 93.71% for Person Re-ID, and 78.20% for HER (see Section 5).
- A comparative analysis with existing state-of-the-art application-specific methodologies is provided to justify our approach (see Section 6).
2. Literature Review
3. Datasets
3.1. Datasets for HAR
3.1.1. WISDM 2011 Dataset
3.1.2. WISDM 2019 Dataset
3.2. Closed-Access Emotions Dataset
3.3. Closed-Access Re-ID Dataset
4. Methodology
4.1. Signal Segmentation
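No reference code for this step survives in the extract, but sliding-window segmentation with a window size w and step size s (the parameter combinations swept in the accuracy table later in the document) is simple to state. Below is a minimal NumPy sketch; the function name `segment_signal` and the (N, C) array layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def segment_signal(signal: np.ndarray, w: int = 256, s: int = 16) -> np.ndarray:
    """Slice a long IMU recording of shape (N, C) into overlapping windows.

    Returns an array of shape (num_windows, w, C), where
    num_windows = (N - w) // s + 1.
    """
    n = signal.shape[0]
    num_windows = (n - w) // s + 1
    return np.stack([signal[i * s : i * s + w] for i in range(num_windows)])

# Example: a 500-sample, 6-channel recording (e.g., 3-axis accelerometer
# plus 3-axis gyroscope) yields (500 - 256) // 16 + 1 = 16 windows.
dummy = np.random.randn(500, 6)
print(segment_signal(dummy).shape)  # (16, 256, 6)
```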
4.2. Generisch-Net
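The Figure 2 caption is the only architectural description that survives in this extract: bidirectional GRU layers followed by three inception-like modules, fed with segmented IMU windows. The Keras sketch below is one plausible reading of that description; every hyperparameter (GRU units, filter counts, kernel sizes, the classification head) is an illustrative assumption rather than the authors' published configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def inception_like_module(x, filters: int = 32):
    """Parallel 1D convolutions with different kernel sizes whose outputs
    are concatenated along the channel axis (inception-style)."""
    b1 = layers.Conv1D(filters, 1, padding="same", activation="relu")(x)
    b3 = layers.Conv1D(filters, 3, padding="same", activation="relu")(x)
    b5 = layers.Conv1D(filters, 5, padding="same", activation="relu")(x)
    return layers.Concatenate()([b1, b3, b5])

def build_generisch_net_sketch(window: int = 256, channels: int = 6,
                               num_classes: int = 6) -> tf.keras.Model:
    inputs = layers.Input(shape=(window, channels))
    # Bidirectional GRU layers over the segmented IMU signal.
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(inputs)
    x = layers.Bidirectional(layers.GRU(64, return_sequences=True))(x)
    # Three inception-like modules, per the Figure 2 caption.
    for _ in range(3):
        x = inception_like_module(x)
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_generisch_net_sketch()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Under this reading, the same backbone is retargeted to HAR, HER, or Re-ID by changing only `num_classes` and the training data, which matches the paper's claim of a single generic model.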
5. Results
5.1. HAR
5.1.1. WISDM 2019
5.1.2. WISDM 2011
5.2. Re-ID
5.3. HER
5.4. Computational Efficiency
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
1. Puri, V.; Kataria, A.; Sharma, V. Artificial intelligence-powered decentralized framework for Internet of Things in Healthcare 4.0. Trans. Emerg. Telecommun. Technol. 2024, 35, e4245.
2. Zhang, M.; Sawchuk, A.A. Human Daily Activity Recognition With Sparse Representation Using Wearable Sensors. IEEE J. Biomed. Health Inform. 2013, 17, 553–560.
3. Davila, J.C.; Cretu, A.M.; Zaremba, M. Wearable sensor data classification for human activity recognition based on an iterative learning framework. Sensors 2017, 17, 1287.
4. Filippeschi, A.; Schmitz, N.; Miezal, M.; Bleser, G.; Ruffaldi, E.; Stricker, D. Survey of motion tracking methods based on inertial sensors: A focus on upper limb human motion. Sensors 2017, 17, 1257.
5. Imran, H.A.; Riaz, Q.; Hussain, M.; Tahir, H.; Arshad, R. Smart-Wearable Sensors and CNN-BiGRU Model: A Powerful Combination for Human Activity Recognition. IEEE Sens. J. 2024, 24, 1963–1974.
6. Challa, S.K.; Kumar, A.; Semwal, V.B.; Dua, N. An optimized deep learning model for human activity recognition using inertial measurement units. Expert Syst. 2023, 40, e13457.
7. Czekaj, Ł.; Kowalewski, M.; Domaszewicz, J.; Kitłowski, R.; Szwoch, M.; Duch, W. Real-Time Sensor-Based Human Activity Recognition for eFitness and eHealth Platforms. Sensors 2024, 24, 3891.
8. Zhao, Y.; Guo, M.; Chen, X.; Sun, J.; Qiu, J. Attention-based CNN fusion model for emotion recognition during walking using discrete wavelet transform on EEG and inertial signals. Big Data Min. Anal. 2023, 7, 188–204.
9. Imran, H.A.; Riaz, Q.; Zeeshan, M.; Hussain, M.; Arshad, R. Machines Perceive Emotions: Identifying Affective States from Human Gait Using On-Body Smart Devices. Appl. Sci. 2023, 13, 4728.
10. Gohar, I.; Riaz, Q.; Shahzad, M.; Zeeshan Ul Hasnain Hashmi, M.; Tahir, H.; Ehsan Ul Haq, M. Person re-identification using deep modeling of temporally correlated inertial motion patterns. Sensors 2020, 20, 949.
11. Müller, P.N.; Müller, A.J.; Achenbach, P.; Göbel, S. IMU-Based Fitness Activity Recognition Using CNNs for Time Series Classification. Sensors 2024, 24, 742.
12. Yan, J.; Toyoura, M.; Wu, X. Identification of a Person in a Trajectory Based on Wearable Sensor Data Analysis. Sensors 2024, 24, 3680.
13. Baklouti, S.; Chaker, A.; Rezgui, T.; Sahbani, A.; Bennour, S.; Laribi, M.A. A Novel IMU-Based System for Work-Related Musculoskeletal Disorders Risk Assessment. Sensors 2024, 24, 3419.
14. Diraco, G.; Rescio, G.; Caroppo, A.; Manni, A.; Leone, A. Human Action Recognition in Smart Living Services and Applications: Context Awareness, Data Availability, Personalization, and Privacy. Sensors 2023, 23, 6040.
15. Mohamed, S.A.; Martinez-Hernandez, U. A light-weight artificial neural network for recognition of activities of daily living. Sensors 2023, 23, 5854.
16. Bailo, G.; Saibene, F.L.; Bandini, V.; Arcuri, P.; Salvatore, A.; Meloni, M.; Castagna, A.; Navarro, J.; Lencioni, T.; Ferrarin, M.; et al. Characterization of Walking in Mild Parkinson’s Disease: Reliability, Validity and Discriminant Ability of the Six-Minute Walk Test Instrumented with a Single Inertial Sensor. Sensors 2024, 24, 662.
17. Zhang, H.; Xiao, Z.; Wang, J.; Li, F.; Szczerbicki, E. A novel IoT-perceptive human activity recognition (HAR) approach using multihead convolutional attention. IEEE Internet Things J. 2019, 7, 1072–1080.
18. Xia, K.; Huang, J.; Wang, H. LSTM-CNN architecture for human activity recognition. IEEE Access 2020, 8, 56855–56866.
19. Pang, H.; Zheng, L.; Fang, H. Cross-Attention Enhanced Pyramid Multi-Scale Networks for Sensor-based Human Activity Recognition. IEEE J. Biomed. Health Inform. 2024, 28, 2733–2744.
20. Li, X.; Nie, L.; Si, X.; Ding, R.; Zhan, D. Enhancing Representation of Deep Features for Sensor-Based Activity Recognition. Mob. Netw. Appl. 2021, 26, 130–145.
21. Nafea, O.; Abdul, W.; Muhammad, G.; Alsulaiman, M. Sensor-Based Human Activity Recognition with Spatio-Temporal Deep Learning. Sensors 2021, 21, 2141.
22. Qin, Z.; Zhang, Y.; Meng, S.; Qin, Z.; Choo, K.K.R. Imaging and fusing time series for wearable sensor-based human activity recognition. Inf. Fusion 2020, 53, 80–87.
23. Zhang, Z.; Song, Y.; Cui, L.; Liu, X.; Zhu, T. Emotion recognition based on customized smart bracelet with built-in accelerometer. PeerJ 2016, 4, e2258.
24. Piskioulis, O.; Tzafilkou, K.; Economides, A. Emotion Detection through Smartphone’s Accelerometer and Gyroscope Sensors. In Proceedings of the 29th ACM Conference on User Modeling, Adaptation and Personalization, Utrecht, The Netherlands, 21–25 June 2021; pp. 130–137.
25. Reyana, A.; Vijayalakshmi, P.; Kautish, S. Multisensor fusion approach: A case study on human physiological factor-based emotion recognition and classification. Int. J. Comput. Appl. Technol. 2021, 66, 107–114.
26. Quiroz, J.C.; Yong, M.H.; Geangu, E. Emotion-recognition using smart watch accelerometer data: Preliminary findings. In Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, Maui, HI, USA, 11–15 September 2017; pp. 805–812.
27. Hashmi, M.A.; Riaz, Q.; Zeeshan, M.; Shahzad, M.; Fraz, M.M. Motion Reveal Emotions: Identifying Emotions from Human Walk Using Chest Mounted Smartphone. IEEE Sens. J. 2020, 20, 13511–13522.
28. Zou, Q.; Wang, Y.; Wang, Q.; Zhao, Y.; Li, Q. Deep learning-based gait recognition using smartphones in the wild. IEEE Trans. Inf. Forensics Secur. 2020, 15, 3197–3212.
29. Qiu, S.; Liu, L.; Zhao, H.; Wang, Z.; Jiang, Y. MEMS inertial sensors based gait analysis for rehabilitation assessment via multi-sensor fusion. Micromachines 2018, 9, 442.
30. Ahmed, A.; Roumeliotis, S. A visual-inertial approach to human gait estimation. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 4614–4621.
31. Lockhart, J.W.; Weiss, G.M.; Xue, J.C.; Gallagher, S.T.; Grosner, A.B.; Pulickal, T.T. Design considerations for the WISDM smart phone-based sensor mining architecture. In Proceedings of the Fifth International Workshop on Knowledge Discovery from Sensor Data, San Diego, CA, USA, 21 August 2011; pp. 25–33.
32. Weiss, G.M.; Yoneda, K.; Hayajneh, T. Smartphone and smartwatch-based biometrics using activities of daily living. IEEE Access 2019, 7, 133190–133202.
33. Ignatov, A. Real-time human activity recognition from accelerometer data using Convolutional Neural Networks. Appl. Soft Comput. 2018, 62, 915–922.
34. Ihianle, I.K.; Nwajana, A.O.; Ebenuwa, S.H.; Otuka, R.I.; Owa, K.; Orisatoki, M.O. A deep learning approach for human activities recognition from multimodal sensing devices. IEEE Access 2020, 8, 179028–179038.
| Dataset | Window Size (w) | Step Size (s) | Accuracy (%) |
|---|---|---|---|
| WISDM 2011 | 128 | 64 | 93.54 |
| WISDM 2011 | 256 | 64 | 94.15 |
| WISDM 2011 | 256 | 32 | 95.08 |
| WISDM 2019 | 128 | 64 | 83.76 |
| WISDM 2019 | 256 | 64 | 87.87 |
| WISDM 2019 | 256 | 32 | 93.65 |
| WISDM 2019 | 256 | 16 | 96.58 |
| Closed-access emotions | 128 | 64 | 39.63 |
| Closed-access emotions | 256 | 64 | 38.21 |
| Closed-access emotions | 256 | 32 | 59.15 |
| Closed-access emotions | 256 | 16 | 78.63 |
| Closed-access Re-ID | 128 | 64 | 77.09 |
| Closed-access Re-ID | 256 | 64 | 87.22 |
| Closed-access Re-ID | 256 | 32 | 93.12 |
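One regularity in this table is worth spelling out: with the window size fixed at w = 256, every reduction of the step size s comes with higher accuracy. A plausible explanation is that overlapping windows multiply the number of training segments. For a recording of length N, sliding-window segmentation (see the sketch in Section 4.1) produces

num_segments = ⌊(N − w) / s⌋ + 1

segments; for a hypothetical N = 10,000-sample recording (the datasets' actual recording lengths vary), (w, s) = (256, 64) yields 153 segments while (256, 16) yields 610, roughly four times as many.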
| Dataset | Method | Accuracy (%) |
|---|---|---|
| WISDM 2011 | LSTM-CNN [18] | 95.85 |
| WISDM 2011 | CNN [33] | 93.32 |
| WISDM 2011 | CNN with an attention mechanism [17] | 96.4 |
| WISDM 2011 | CNN-BiGRU with Direct-link [5] | 98.81 |
| WISDM 2011 | Presented model | 95.624 |
| WISDM 2019 | MCBLSTM [34] | 96.6 ± 1.47 |
| WISDM 2019 | KNN, DT, RF [32] | 94.4 |
| WISDM 2019 | CNN-BiGRU with Direct-link [5] | 98.4 |
| WISDM 2019 | Presented model | 96.978 |
| Closed-access emotions | Traditional ML [27] | 86.45 |
| Closed-access emotions | CNN-BiGRU with Raw-link [9] | 95 |
| Closed-access emotions | Presented model | 78.198 |
| Closed-access Re-ID | BiGRU [10] | 86.23 |
| Closed-access Re-ID | Presented model | 93.713 |
Hamza, K.; Riaz, Q.; Imran, H.A.; Hussain, M.; Krüger, B. Generisch-Net: A Generic Deep Model for Analyzing Human Motion with Wearable Sensors in the Internet of Health Things. Sensors 2024, 24, 6167. https://doi.org/10.3390/s24196167