IMU Sensor-Based Worker Behavior Recognition and Construction of a Cyber–Physical System Environment
Figure 1. Fabrication of the prototype IMU sensor: (a) appearance of the fabricated IMU sensor; (b) diagram of the IMU sensor and the positions of the included sensors.
Figure 2. Algorithm flowchart for worker status assessment.
Figure 3. Results of applying the forward gaze algorithm: (a) when only the head turns; (b) when both body and head turn together (during directional change).
Figure 4. Results of applying the Kalman filter and walking detection: (a) result of applying the Kalman filter to raw data; (b) walking detection results (1 when there is no foot movement; 0 when there is foot movement).
Figure 5. Results of applying the energy detection algorithm during jumping.
Figure 6. Results of applying the extended Kalman filter to barometric data and estimation of altitude changes.
Figure 7. IMU sensor placement by body part.
Figure 8. IMU sensor application integration screen and database connection screen.
Figure 9. Dashboard and environment setup for CPS implementation: (a) CPS dashboard screen layout; (b) CPS environment construction scene.
Figure 10. Worker icons and UI in CPS operation.
Figure 11. Example of CPS application for multiple workers.
Abstract
1. Introduction
2. Materials and Methods
2.1. Development of Sensors with Enhanced Field Applicability
- Accelerometer: by measuring acceleration, the sensor can assess information such as an object’s tilt (inclination angle) and vibrations. Sudden changes in acceleration are treated as abnormal signals and are used to monitor for events such as worker falls.
- Gyroscope: this sensor measures the rotational angle per unit time, allowing for detection of changes in orientation and movement.
- PPG sensor: by measuring a worker’s heart rate and oxygen saturation, this sensor aids in identifying critical health risk factors.
- Body temperature sensor: by continuously measuring the worker’s body temperature in real time, this sensor detects sudden changes that may indicate illnesses such as heatstroke or hypothermia.
- Barometric pressure sensor: this sensor measures atmospheric pressure, which helps determine the conditions at the worker’s job site and identify associated risk factors.
- Acoustic sensor: by detecting changes in sound levels during specialized tasks (e.g., welding, cutting), this sensor monitors the worker’s activity duration.
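To make the data flow concrete, the readings listed above can be bundled into a single telemetry record per sampling instant. The following sketch is illustrative only: the field names and types are assumptions, not the paper's actual database schema.

```python
from dataclasses import dataclass

# Hypothetical per-sample telemetry record combining the sensors described
# above; field names and units are illustrative assumptions.
@dataclass
class WorkerSample:
    timestamp: float        # Unix time (s)
    acc: tuple              # accelerometer, 3 axes (g)
    gyro: tuple             # gyroscope, 3 axes (deg/s)
    heart_rate: float       # PPG-derived heart rate (bpm)
    spo2: float             # PPG-derived oxygen saturation (%)
    body_temp: float        # body temperature (deg C)
    pressure_hpa: float     # barometric pressure (hPa)
    sound_db: float         # acoustic level (dB)
```

A record like this would be filled in once per sampling cycle and forwarded to the database that the CPS dashboard reads from.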
2.2. Worker Behavior Analysis and Algorithm Development
2.2.1. Forward Attention Algorithm
- Prediction step: predict the next state based on the current state vector and the error covariance matrix.
- Update step: correct the state vector using the measured Roll and Pitch values, and reduce errors by utilizing the measurement noise covariance (R).
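The predict/update cycle above can be sketched with a minimal one-dimensional Kalman filter for a single orientation angle (e.g., Roll). This is a simplified illustration: the paper's gyro-bias state and the Q_acc/Q_bias split are collapsed here into a single process noise q, and the numeric values are placeholders, not the tuned parameters.

```python
# Minimal scalar Kalman filter sketch for one orientation angle.
# q and r are illustrative noise covariances, not the paper's tuned values.
class ScalarKalman:
    def __init__(self, q=0.001, r=0.01):
        self.x = 0.0   # state estimate (angle, degrees)
        self.p = 1.0   # error covariance
        self.q = q     # process noise covariance
        self.r = r     # measurement noise covariance (R)

    def predict(self, rate=0.0, dt=1.0 / 15):
        # Prediction step: propagate the state with the gyro rate,
        # and grow the covariance by the process noise.
        self.x += rate * dt
        self.p += self.q
        return self.x

    def update(self, z):
        # Update step: correct the state with the accelerometer-derived
        # angle z, weighted by the Kalman gain.
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Feeding noisy angle measurements through predict/update pairs drives the estimate toward the true angle while suppressing measurement noise.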
Algorithm 1. Detect forward attention events.

Input: body_yaw, head_yaw, body_acc, head_acc, timestamps
Output: Forward attention events with timestamps
1. Set parameters
   threshold = 10      # Yaw angle threshold for peak detection
   distance = 45       # Minimum sample distance between peaks
   window_size = 150   # NMNI filter window size
   ars = 0.13          # Angular rate sensitivity for NMNI filter
2. Apply NMNI filter to Yaw data
   body_yaw_filtered = apply_nmni(body_yaw, window_size, ars)
   head_yaw_filtered = apply_nmni(head_yaw, window_size, ars)
3. Initialize Kalman filter for orientation estimation
   kf = KalmanFilter(Q_acc=0.001, Q_bias=0.03, R=0.01)
4. Estimate orientation with Kalman filter
   for each timestamp t:
       roll, pitch, yaw = estimate_orientation(kf, body_acc[t], head_acc[t],
                                               body_yaw_filtered[t], head_yaw_filtered[t], dt)
5. Detect peaks in filtered Yaw data
   body_peaks = find_peaks(body_yaw_filtered, height=threshold, distance=distance)
   head_peaks = find_peaks(head_yaw_filtered, height=threshold, distance=distance)
6. Classify events
   for each peak in body_peaks:
       if overlap with head_peaks: classify as Both Turned
       else: classify as Head Turned
7. Merge overlapping Both Turned events
   for each consecutive Both Turned event:
       if overlap: merge events
8. Output results
   for each event:
       record start_time, end_time from timestamps
       store event type (“Both Turned” or “Head Turned”)
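The peak-overlap classification in Algorithm 1 can be sketched as follows. This is a simplified illustration, not the paper's implementation: find_peaks is reduced to a local-maximum scan (SciPy's version handles plateaus and prominence), classification here iterates the head peaks, and the overlap tolerance is an assumed value.

```python
# Hedged sketch of the peak classification step: a head-yaw peak that
# coincides (within tol samples) with a body-yaw peak is "Both Turned";
# otherwise only the head turned. Thresholds are illustrative.
def find_peaks(signal, height=10, distance=45):
    peaks, last = [], -distance
    for i in range(1, len(signal) - 1):
        # Local maximum at or above the height threshold,
        # at least `distance` samples after the previous peak.
        if signal[i] >= height and signal[i - 1] < signal[i] >= signal[i + 1]:
            if i - last >= distance:
                peaks.append(i)
                last = i
    return peaks

def classify_events(body_yaw, head_yaw, tol=45):
    body_peaks = find_peaks(body_yaw)
    events = []
    for p in find_peaks(head_yaw):
        both = any(abs(p - b) <= tol for b in body_peaks)
        events.append((p, "Both Turned" if both else "Head Turned"))
    return events
```

With a yaw bump in the head signal only, the event classifies as "Head Turned"; the same bump in both signals classifies as "Both Turned".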
2.2.2. Gait Detection Algorithm
Algorithm 2. Gait detection using Kalman filter and standard deviation.

Input: Accelerometer (acc_x, acc_y, acc_z) and gyroscope (gyro_x, gyro_y, gyro_z) data
Output: Stance-phase detection based on filtered acceleration and gyro data
1. Initialize Kalman filters
   kalman_filter_left = KalmanFilter()
   kalman_filter_right = KalmanFilter()
2. Filter and calibrate data using Kalman filter
   for each data point in acc and gyro data:
       z = [acc_x, acc_y, acc_z, gyro_x, gyro_y, gyro_z]   # Measurement vector
       kalman_filter.predict()          # Prediction step
       kalman_filter.update(z)          # Update step
       save filtered_acc and filtered_gyro   # Store filtered accelerometer and gyroscope data
3. Calculate standard deviation with sliding window
   acc_magnitude = sqrt(filtered_acc_x^2 + filtered_acc_y^2 + filtered_acc_z^2)
   sigma_a = rolling_std(acc_magnitude, window=window_size)
   sigma_omega = rolling_std(filtered_gyro_y, window=window_size)
4. Detect stance phase
   for each sample:
       if sigma_a < 0.1 and sigma_omega < 20: classify as Stance Phase
       else: classify as Non-Stance Phase
5. Analyze stance-phase intervals
   for each classified phase:
       if Stance Phase: count consecutive samples as Stance Interval
       else: count consecutive samples as Non-Stance Interval
6. Output results
   Print Stance Intervals (number of samples in each stance phase)
   Print Non-Stance Intervals (number of samples in each non-stance phase)
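The core of Algorithm 2, thresholding the rolling standard deviation of acceleration magnitude and gyro output, can be sketched as below. The 0.1 and 20 thresholds follow the pseudocode; the window size is an assumed value, and Kalman filtering of the raw signals is taken as already done.

```python
import math
from statistics import pstdev

# Hedged sketch of stance-phase detection: a sample is "stance" when both
# the rolling std of acceleration magnitude and of gyro-Y fall below the
# thresholds from the pseudocode. window is an illustrative choice (~1 s at 15 Hz).
def detect_stance(acc, gyro_y, window=15, th_a=0.1, th_w=20.0):
    mag = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in acc]
    phases = []
    for i in range(len(mag)):
        lo = max(0, i - window + 1)
        sigma_a = pstdev(mag[lo:i + 1])       # rolling std of |acc|
        sigma_w = pstdev(gyro_y[lo:i + 1])    # rolling std of gyro-Y
        phases.append(sigma_a < th_a and sigma_w < th_w)
    return phases  # True = stance phase (foot stationary)
```

A stationary foot (constant gravity vector, quiet gyro) yields all-True output; alternating, high-variance samples fall out of the stance class.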
2.2.3. Energy Detection Algorithm
Algorithm 3. Accelerometer energy detection algorithm.

Input: acc_x, acc_y, acc_z data
Output: Energy values for accelerometer data
1. Calculate RMS
   rms_acc = sqrt(acc_x^2 + acc_y^2 + acc_z^2)
2. Remove DC component
   rms_acc_detrended = rms_acc - moving_average(rms_acc, window=30)
3. Calculate energy
   energy_acc = rolling_mean(rms_acc_detrended^2, window=30)
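The three steps of Algorithm 3 translate directly into code. The sketch below is a plain-Python illustration (a trailing moving average stands in for both moving_average and rolling_mean); the 30-sample window follows the pseudocode (about 2 s at 15 Hz).

```python
import math

# Trailing moving average used for both DC removal and energy smoothing.
def moving_average(x, window):
    return [sum(x[max(0, i - window + 1):i + 1]) /
            (i - max(0, i - window + 1) + 1)
            for i in range(len(x))]

# Hedged sketch of Algorithm 3: RMS magnitude, DC removal, windowed energy.
def energy(acc_x, acc_y, acc_z, window=30):
    # 1. RMS magnitude per sample
    rms = [math.sqrt(x * x + y * y + z * z)
           for x, y, z in zip(acc_x, acc_y, acc_z)]
    # 2. Remove the DC component (gravity and slow drift)
    baseline = moving_average(rms, window)
    detrended = [r - b for r, b in zip(rms, baseline)]
    # 3. Windowed mean of the squared detrended signal
    return moving_average([d * d for d in detrended], window)
```

A stationary sensor yields near-zero energy, while an impulsive burst (as in jumping) raises the energy trace sharply, which is what the jump detector thresholds on.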
2.2.4. Altitude Detection Algorithm
Algorithm 4. EKF for altitude estimation.

Input: Barometer (hPa) data
Output: Altitude estimates over time
1. EKF-based altitude estimation (single state)
   Initialize altitude state x and covariance p
   for each barometer data point:
       Predict next state: x and p
       Update altitude from barometer: z_barometer = [hPa]
       Compute Kalman gain K and update x, p
2. Calculate altitude change
   Convert pressure to altitude (cm)
   Slide a 300-sample window to compute altitude change:
       altitude_change = altitude_end - altitude_start
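Because Algorithm 4 tracks a single altitude state, the filter reduces to the scalar case once pressure is converted to altitude. The sketch below uses the standard international barometric formula for the conversion and illustrative noise covariances; it is not the paper's tuned implementation.

```python
# Hedged sketch of Algorithm 4: convert barometer readings (hPa) to altitude
# (cm) and smooth with a single-state Kalman filter. p0, q, r are assumptions.
def pressure_to_altitude_cm(hpa, p0=1013.25):
    # International barometric formula, result in centimetres.
    return 44330.0 * (1.0 - (hpa / p0) ** 0.1903) * 100.0

def filter_altitude(pressures_hpa, q=0.01, r=4.0):
    x = pressure_to_altitude_cm(pressures_hpa[0])  # initial altitude state
    p = 1.0                                        # initial covariance
    out = []
    for hpa in pressures_hpa:
        p += q                              # predict: covariance grows
        z = pressure_to_altitude_cm(hpa)    # barometric altitude measurement
        k = p / (p + r)                     # Kalman gain
        x += k * (z - x)                    # update state toward measurement
        p *= (1.0 - k)
        out.append(x)
    return out
```

Sliding a window over the filtered output and differencing its endpoints (altitude_end - altitude_start, as in step 2) then flags sustained climbs such as moving to high-altitude work positions.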
3. Results
4. Discussion
5. Conclusions
- The IMU sensors used in this research address challenges associated with existing commercial sensors—such as large form factors and difficulties in internal filtering, communication, and server integration—when implementing a CPS.
- To minimize worker discomfort and enable seamless attachment to personal protective equipment, the sensors were miniaturized and designed for placement on the head, body, hands, and legs. This approach allows for more granular measurement of diverse work activities.
- The IMU sensors were operated at a relatively low sampling rate of 15 times per second to extend their operating time during work hours, while an algorithm was designed to effectively capture workers’ movements in on-site conditions. In addition, the algorithm retained the flexibility to operate at higher frequencies (e.g., above 50 Hz) for more detailed motion analysis when needed.
- By attaching sensors to the head, body, both hands, and both legs, worker behaviors—such as walking, jumping, standing, sitting, working at height, and looking away from the forward direction—could be detected with approximately 90% accuracy. The IMU sensors also assessed workers’ health status (e.g., oxygen saturation, heart rate, and temperature) and transmitted these data to a database, which was then linked to the CPS interface, enabling managers to monitor workers in real time.
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Action | Total Trials | Correctly Classified | Accuracy (%) |
|---|---|---|---|
| Walking | 50 | 50 | 100 |
| Jumping | 50 | 47 | 94 |
| Standing | 50 | 50 | 100 |
| High-altitude work | 50 | 45 | 90 |
| Not looking forward | 50 | 43 | 86 |
| Sitting | 50 | 40 | 80 |
Park, S.; Youm, M.; Kim, J. IMU Sensor-Based Worker Behavior Recognition and Construction of a Cyber–Physical System Environment. Sensors 2025, 25, 442. https://doi.org/10.3390/s25020442