Improving Human Activity Recognition Through 1D-ResNet: A Wearable Wristband for 14 Workout Movements
Figure 1. The sensing mechanism for workout recognition, using the Z-axis acceleration of an IMU sensor relative to gravity.
Figure 2. The wristband with the IMU sensor; the red arrows indicate the Z-axis of the sensor coordinate system.
Figure 3. The 14 workouts chosen for recognition: (a) Bench press, (b) Incline bench press, (c) Dumbbell shoulder press, (d) Dumbbell triceps extension, (e) Dumbbell kick, (f) Dumbbell front raise, (g) Lat pull-down, (h) Straight-arm lat pull-down, (i) Deadlift, (j) Dumbbell bent row, (k) One-arm dumbbell row, (l) EZ-bar curls, (m) Machine preacher curl, (n) Seated dumbbell lateral raise [30].
Figure 4. The main path and skip connection of a residual block in ResNet.
Figure 5. The architecture of the 1D ResNet (see the code sketch following this figure list).
Figure 6. The Z-axis acceleration data of the bench press for Subject 1.
Figure 7. The training and validation accuracy of (a) the 1D ResNet and (b) the 3D ResNet.
Figure 8. The confusion matrix of workout recognition with the 1D ResNet.
Figure 9. The confusion matrix of workout recognition with the 1D ResNet on the fresh test.
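Figures 4 and 5 describe the residual block and the overall 1D ResNet. As a rough illustration of how such a block processes windowed acceleration data, the following is a minimal sketch of a generic 1D residual block in PyTorch; the channel counts, kernel width, window length, and use of batch normalization are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class ResidualBlock1D(nn.Module):
    """A generic 1D residual block: a main path (conv-BN-ReLU-conv-BN)
    plus a skip connection, as sketched in Figure 4."""
    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 7):
        super().__init__()
        padding = kernel_size // 2  # keep the temporal length unchanged
        self.main_path = nn.Sequential(
            nn.Conv1d(in_channels, out_channels, kernel_size, padding=padding),
            nn.BatchNorm1d(out_channels),
            nn.ReLU(),
            nn.Conv1d(out_channels, out_channels, kernel_size, padding=padding),
            nn.BatchNorm1d(out_channels),
        )
        # 1x1 convolution so the skip connection matches the channel count
        self.skip = (nn.Identity() if in_channels == out_channels
                     else nn.Conv1d(in_channels, out_channels, kernel_size=1))
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Output = activation(main path + skip connection)
        return self.relu(self.main_path(x) + self.skip(x))

# Example: a batch of 32 windows, 1 channel (Z-axis acceleration), 200 samples each
block = ResidualBlock1D(in_channels=1, out_channels=64)
print(block(torch.randn(32, 1, 200)).shape)  # torch.Size([32, 64, 200])
```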
Abstract
1. Introduction
2. Materials and Methods
2.1. Z-Axis Acceleration Data for Workout Recognition
2.2. 14 Workout Data
2.3. 1D-ResNet
3. Results and Discussion
3.1. Evaluation of 1D ResNet
3.2. Comparison with Previous Research
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
1. Fangbemi, A.S.; Liu, B.; Yu, N.H.; Zhang, Y. Efficient human action recognition interface for augmented and virtual reality applications based on binary descriptor. In Augmented Reality, Virtual Reality, and Computer Graphics: Proceedings of the 5th International Conference, AVR 2018, Otranto, Italy, 24–27 June 2018, Part I; Springer International Publishing: Cham, Switzerland, 2018; pp. 252–260.
2. Xia, C.; Sugiura, Y. Optimizing sensor position with virtual sensors in human activity recognition system design. Sensors 2021, 21, 6893.
3. Xiao, F.; Pei, L.; Chu, L.; Zou, D.; Yu, W.; Zhu, Y.; Li, T. A deep learning method for complex human activity recognition using virtual wearable sensors. In Spatial Data and Intelligence: Proceedings of the First International Conference, SpatialDI 2020, Virtual Event, 8–9 May 2020; Springer International Publishing: Cham, Switzerland, 2021; pp. 261–270.
4. Jeyakumar, J.V.; Lai, L.; Suda, N.; Srivastava, M. SenseHAR: A robust virtual activity sensor for smartphones and wearables. In Proceedings of the 17th Conference on Embedded Networked Sensor Systems, New York, NY, USA, 10–13 November 2019; pp. 15–28.
5. Schuldhaus, D. Human Activity Recognition in Daily Life and Sports Using Inertial Sensors; FAU University Press: Boca Raton, FL, USA, 2019.
6. Host, K.; Ivašić-Kos, M. An overview of human action recognition in sports based on computer vision. Heliyon 2022, 8, e09633.
7. Pajak, G.; Krutz, P.; Patalas-Maliszewska, J.; Rehm, M.; Pajak, I.; Dix, M. An approach to sport activities recognition based on an inertial sensor and deep learning. Sens. Actuators A Phys. 2022, 345, 113773.
8. Bibbò, L.; Vellasco, M.M. Human activity recognition (HAR) in healthcare. Appl. Sci. 2023, 13, 13009.
9. Frank, A.E.; Kubota, A.; Riek, L.D. Wearable activity recognition for robust human-robot teaming in safety-critical environments via hybrid neural networks. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 449–454.
10. Jaramillo, I.E.; Jeong, J.G.; Lopez, P.R.; Lee, C.-H.; Kang, D.-Y.; Ha, T.-J.; Oh, J.-H.; Jung, H.; Lee, J.H.; Lee, W.H. Real-time human activity recognition with IMU and encoder sensors in wearable exoskeleton robot via deep learning networks. Sensors 2022, 22, 9690.
11. Martínez-Villaseñor, L.; Ponce, H. A concise review on sensor signal acquisition and transformation applied to human activity recognition and human–robot interaction. Int. J. Distrib. Sens. Netw. 2019, 15, 1550147719853987.
12. Hoelzemann, A.; Romero, J.L.; Bock, M.; Laerhoven, K.V.; Lv, Q. Hang-time HAR: A benchmark dataset for basketball activity recognition using wrist-worn inertial sensors. Sensors 2023, 23, 5879.
13. Wang, Z.; Wu, D.; Chen, J.; Ghoneim, A.; Hossain, M.A. A triaxial accelerometer-based human activity recognition via EEMD-based features and game-theory-based feature selection. IEEE Sens. J. 2016, 16, 3198–3207.
14. Zhang, Z. Microsoft Kinect sensor and its effect. IEEE Multimed. 2012, 19, 4–10.
15. Han, J.; Shao, L.; Xu, D.; Shotton, J. Enhanced computer vision with Microsoft Kinect sensor: A review. IEEE Trans. Cybern. 2013, 43, 1318–1334.
16. Li, X.; He, Y.; Jing, X. A survey of deep learning-based human activity recognition in radar. Remote Sens. 2019, 11, 1068.
17. Zhu, S.; Guendel, R.G.; Yarovoy, A.; Fioranelli, F. Continuous human activity recognition with distributed radar sensor networks and CNN–RNN architectures. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5115215.
18. Mohammadzadeh, F.F.; Liu, S.; Bond, K.A.; Nam, C.S. Feasibility of a wearable, sensor-based motion tracking system. Procedia Manuf. 2015, 3, 192–199.
19. Longo, U.G.; De Salvatore, S.; Sassi, M.; Carnevale, A.; De Luca, G.; Denaro, V. Motion tracking algorithms based on wearable inertial sensor: A focus on shoulder. Electronics 2022, 11, 1741.
20. Rana, M.; Mittal, V. Wearable sensors for real-time kinematics analysis in sports: A review. IEEE Sens. J. 2020, 21, 1187–1207.
21. Poitras, I.; Dupuis, F.; Bielmann, M.; Campeau-Lecours, A.; Mercier, C.; Bouyer, L.J.; Roy, J.-S. Validity and reliability of wearable sensors for joint angle estimation: A systematic review. Sensors 2019, 19, 1555.
22. Bakhshi, S.; Mahoor, M.H. Development of a wearable sensor system for measuring body joint flexion. In Proceedings of the 2011 International Conference on Body Sensor Networks, Dallas, TX, USA, 23–25 May 2011; pp. 35–40.
23. Teague, C.N.; Heller, J.A.; Nevius, B.N.; Carek, A.M.; Mabrouk, S.; Garcia-Vicente, F.; Inan, O.T.; Etemadi, M. A wearable, multimodal sensing system to monitor knee joint health. IEEE Sens. J. 2020, 20, 10323–10334.
24. Zhao, H.; Ma, Y.; Wang, S.; Watson, A.; Zhou, G. MobiGesture: Mobility-aware hand gesture recognition for healthcare. Smart Health 2018, 9, 129–143.
25. Digo, E.; Polito, M.; Pastorelli, S.; Gastaldi, L. Detection of upper limb abrupt gestures for human–machine interaction using deep learning techniques. J. Braz. Soc. Mech. Sci. Eng. 2024, 46, 227.
26. Rivera, P.; Valarezo, E.; Choi, M.-T.; Kim, T.-S. Recognition of human hand activities based on a single wrist IMU using recurrent neural networks. Int. J. Pharma Med. Biol. Sci. 2017, 6, 114–118.
27. Ayvaz, U.; Elmoughni, H.; Atalay, A.; Atalay, Ö.; Ince, G. Real-time human activity recognition using textile-based sensors. In Proceedings of the EAI International Conference on Body Area Networks, Tallinn, Estonia, 25–26 December 2020; pp. 168–183.
28. Zhang, S.; Li, Y.; Zhang, S.; Shahabi, F.; Xia, S.; Deng, Y.; Alshurafa, N. Deep learning in human activity recognition with wearable sensors: A review on advances. Sensors 2022, 22, 1476.
29. Mani, N.; Haridoss, P.; George, B. Evaluation of a combined conductive fabric-based suspender system and machine learning approach for human activity recognition. IEEE Open J. Instrum. Meas. 2023, 2, 2500310.
30. Koo, B.; Nguyen, N.T.; Kim, J. Identification and classification of human body exercises on smart textile bands by combining decision tree and convolutional neural networks. Sensors 2023, 23, 6223.
31. Shafiq, M.; Gu, Z. Deep residual learning for image recognition: A survey. Appl. Sci. 2022, 12, 8972.
| Subject | Height (cm) | Weight (kg) |
|---|---|---|
| Subject 1 | 170 | 75 |
| Subject 2 | 183 | 80 |
| Subject 3 | 180 | 78 |
| Subject 4 | 173 | 80 |
| Subject 5 | 174 | 74 |
Per-workout precision, recall, and F1-score of the 1D ResNet on the test set (workout labels (a)–(n) as in Figure 3; the final row is the macro average over the 14 workouts):

| Workout | Precision | Recall | F1-Score |
|---|---|---|---|
| (a) | 1.00 | 0.94 | 0.97 |
| (b) | 0.93 | 0.93 | 0.93 |
| (c) | 0.90 | 0.96 | 0.93 |
| (d) | 1.00 | 1.00 | 1.00 |
| (e) | 1.00 | 1.00 | 1.00 |
| (f) | 1.00 | 1.00 | 1.00 |
| (g) | 1.00 | 0.88 | 0.94 |
| (h) | 0.86 | 0.95 | 0.90 |
| (i) | 0.96 | 0.96 | 0.96 |
| (j) | 1.00 | 1.00 | 1.00 |
| (k) | 1.00 | 1.00 | 1.00 |
| (l) | 1.00 | 1.00 | 1.00 |
| (m) | 1.00 | 1.00 | 1.00 |
| (n) | 1.00 | 1.00 | 1.00 |
| Average | 0.98 | 0.97 | 0.97 |
Per-workout precision, recall, and F1-score of the 1D ResNet on the fresh test (workout labels as above; the final row is the macro average):

| Workout | Precision | Recall | F1-Score |
|---|---|---|---|
| (a) | 0.81 | 0.93 | 0.87 |
| (b) | 0.69 | 0.85 | 0.76 |
| (c) | 0.89 | 0.53 | 0.67 |
| (d) | 1.00 | 0.45 | 0.62 |
| (e) | 1.00 | 1.00 | 1.00 |
| (f) | 1.00 | 1.00 | 1.00 |
| (g) | 0.75 | 1.00 | 0.86 |
| (h) | 1.00 | 1.00 | 1.00 |
| (i) | 1.00 | 1.00 | 1.00 |
| (j) | 1.00 | 0.91 | 0.95 |
| (k) | 0.93 | 1.00 | 0.97 |
| (l) | 1.00 | 1.00 | 1.00 |
| (m) | 0.68 | 1.00 | 0.81 |
| (n) | 1.00 | 1.00 | 1.00 |
| Average | 0.91 | 0.91 | 0.89 |
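The final rows of the two tables above are macro averages: the per-workout precision, recall, and F1 scores averaged with equal weight over the 14 classes. The sketch below shows one way to compute such figures with scikit-learn; the label arrays are invented placeholders, not data from the study.

```python
from sklearn.metrics import precision_recall_fscore_support

# Hypothetical true and predicted workout labels, 'a'..'n' for the 14 classes
y_true = ['a', 'a', 'b', 'c', 'd', 'e', 'f', 'g']
y_pred = ['a', 'b', 'b', 'c', 'd', 'e', 'f', 'g']

# average='macro' takes an unweighted mean over classes, matching the
# summary rows in the tables above
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average='macro', zero_division=0)
print(f"precision={precision:.2f}, recall={recall:.2f}, f1={f1:.2f}")
```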
Cite as: Kim, S.-U.; Kim, J.-Y. Improving Human Activity Recognition Through 1D-ResNet: A Wearable Wristband for 14 Workout Movements. Processes 2025, 13, 207. https://doi.org/10.3390/pr13010207