Human Activity Recognition Using Binary Sensors, BLE Beacons, an Intelligent Floor and Acceleration Data: A Machine Learning Approach
Abstract
1. Introduction
2. Methods
2.1. Business Understanding and Data Understanding
- Event streams of binary sensors: 30 binary sensors were located in different parts of the SmartLab. Each reports a binary value such as Open/Close, Movement/No movement or Pressure/No pressure, together with its timestamp.
- Spatial data from an intelligent floor: capacitance data from each of the SmartLab's 40 smart floor modules, with their respective timestamps.
- Proximity data between a smart watch worn by the inhabitant and Bluetooth Low Energy (BLE) beacons: 15 BLE beacons were located in different parts of the SmartLab. Their RSSI was collected at a sampling frequency of 0.25 Hz.
- Acceleration data from the same smart watch worn by the inhabitant: 3D acceleration collected at a sampling frequency of 50 Hz. (Illustrative record types for these four streams are sketched below.)
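To make the structure of the four streams concrete, the following is a minimal sketch in Python. The record types and field names are illustrative assumptions, not the dataset's actual schema.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class BinaryEvent:
    """One event from one of the 30 binary sensors."""
    timestamp: float      # event time
    sensor_id: str        # e.g., "Door", "Refrigerator" (see Appendix A)
    active: bool          # True = Open / Movement / Pressure

@dataclass
class FloorSample:
    """One reading of the intelligent floor."""
    timestamp: float
    capacitance: List[float]   # one value per floor module (40 modules)

@dataclass
class BeaconSample:
    """One proximity reading, sampled at 0.25 Hz (one reading every 4 s)."""
    timestamp: float
    rssi: Dict[str, float]     # beacon ID -> RSSI (15 beacons)

@dataclass
class AccelSample:
    """One 3D acceleration sample from the smart watch, at 50 Hz."""
    timestamp: float
    x: float
    y: float
    z: float
```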
2.2. Data Preparation
- Binary sensors: 30 binary features (one per sensor). If a sensor reports Open, Movement or Pressure in any sample belonging to the segment, its feature is set to "1"; otherwise it is set to "0".
- Intelligent floor: 40 binary features (one per module). If the capacitance of a module is greater than zero in any sample of the segment, its feature is "1"; otherwise it is "0".
- Proximity data: 4 categorical features corresponding to the IDs of the four nearest BLE beacons.
- Acceleration: 13 statistical features: the mean, median, standard deviation and mean absolute deviation of each axis (12 features), plus the mean of the square root of the sum of the squared axis values, i.e., the mean acceleration magnitude (1 feature). A sketch of this feature extraction follows the list.
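As a concrete illustration of the feature set above, here is a minimal per-segment feature-extraction sketch in Python. It assumes the segment's samples have already been gathered into NumPy arrays and a beacon-to-RSSI mapping; the function name and array layouts are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def extract_segment_features(bin_states, floor_caps, beacon_rssi, accel):
    """Build the 87-value feature vector (30 + 40 + 4 + 13) for one segment.

    bin_states  : (n, 30) bool array  -- binary sensor states per sample
    floor_caps  : (m, 40) float array -- capacitance per floor module
    beacon_rssi : dict beacon_id -> RSSI; higher (less negative) = closer
    accel       : (k, 3) float array  -- 3D acceleration sampled at 50 Hz
    """
    # 30 binary features: 1 if the sensor was active in any sample of the segment
    f_bin = bin_states.any(axis=0).astype(int)

    # 40 binary features: 1 if the module's capacitance exceeded zero at any point
    f_floor = (floor_caps > 0).any(axis=0).astype(int)

    # 4 categorical features: IDs of the four nearest beacons (strongest RSSI)
    f_beacon = sorted(beacon_rssi, key=beacon_rssi.get, reverse=True)[:4]

    # 13 statistical acceleration features
    mean = accel.mean(axis=0)                       # per-axis mean (3)
    median = np.median(accel, axis=0)               # per-axis median (3)
    std = accel.std(axis=0)                         # per-axis standard deviation (3)
    mad = np.abs(accel - mean).mean(axis=0)         # per-axis mean absolute deviation (3)
    mag = np.sqrt((accel ** 2).sum(axis=1)).mean()  # mean acceleration magnitude (1)

    return (list(f_bin) + list(f_floor) + f_beacon
            + list(mean) + list(median) + list(std) + list(mad) + [mag])
```

Note that the four beacon-ID features are categorical, so they would need encoding (e.g., one-hot) before use with purely numeric classifiers.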
2.3. Modeling
2.4. Evaluation
3. Results
4. Discussion and Conclusions
Funding
Conflicts of Interest
Appendix A
Objects monitored by binary sensors | | |
---|---|---|---
Door | Kettle | Trash | Cupboard cups |
TV | Medication box | Tap | Dishwasher |
Sensor Kitchen movement | Fruit platter | Tank | Top WC |
Motion sensor bathroom | Cutlery | Laundry basket | Closet |
Motion sensor bedroom | Pots | Pyjamas drawer | Washing machine |
Motion sensor sofa | Water bottle | Bed | Pantry |
Refrigerator | Remote XBOX | Kitchen faucet | |
Microwave | Pressure sofa | Wardrobe clothes |
Objects tagged with BLE beacons | |
---|---|---
TV controller | Fridge | Pyjama drawer |
Book | Pot drawer | Bed |
Entrance door | Water bottle | Bathroom tap |
Medicine box | Garbage can | Toothbrush |
Food cupboard | Wardrobe door | Laundry basket |
Activity ID | Name | Activity ID | Name
---|---|---|---
1 | Take medication | 13 | Leave the SmartLab |
2 | Prepare breakfast | 14 | Visit in the SmartLab |
3 | Prepare lunch | 15 | Put waste in the bin
4 | Prepare dinner | 16 | Wash hands |
5 | Breakfast | 17 | Brush teeth |
6 | Lunch | 18 | Use the toilet |
7 | Dinner | 19 | Wash dishes |
8 | Eat a snack | 20 | Put washing into the washing machine |
9 | Watch TV | 21 | Work at table
10 | Enter the SmartLab | 22 | Dressing
11 | Play a video game | 23 | Go to the bed
12 | Relax on the sofa | 24 | Wake up
Appendix B
 | Idle | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Idle | 5 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 8 | 0 | 0 | 1 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 1 | 1 | 1 | 3 |
1 | 1 | 9 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
2 | 0 | 0 | 9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
3 | 0 | 0 | 0 | 41 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
4 | 0 | 0 | 0 | 0 | 18 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 | 0 | 0 | 3 | 0 | 0 | 31 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
6 | 0 | 1 | 0 | 1 | 0 | 0 | 23 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
7 | 0 | 1 | 0 | 0 | 3 | 0 | 0 | 36 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
10 | 2 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
11 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 |
12 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 76 | 0 | 17 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 0 | 0 | 0 | 0 | 2 | 0 |
13 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 7 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 | 0 | 0 |
14 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 |
15 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
17 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 6 | 0 | 19 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
18 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 | 0 | 1 | 0 | 0 | 0 | 0 |
19 | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
20 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
21 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 35 | 0 | 0 | 0 |
22 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 21 | 0 | 0 |
23 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 3 | 0 |
24 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 17 |
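Overall accuracy and per-class precision/recall can be read directly from a confusion matrix such as the one above. Below is a minimal sketch, assuming rows hold the actual activities and columns the predicted ones (the orientation of the Appendix B matrix is an assumption here).

```python
import numpy as np

def confusion_matrix_metrics(C):
    """Overall accuracy plus per-class precision and recall.

    C is a square confusion matrix; rows = actual class, columns =
    predicted class (assumed orientation). Empty classes yield 0.
    """
    C = np.asarray(C, dtype=float)
    tp = np.diag(C)                  # correctly classified counts per class
    accuracy = tp.sum() / C.sum()    # trace over total number of segments
    row_sums = C.sum(axis=1)         # actual segments per class
    col_sums = C.sum(axis=0)         # predicted segments per class
    recall = np.divide(tp, row_sums, out=np.zeros_like(tp), where=row_sums > 0)
    precision = np.divide(tp, col_sums, out=np.zeros_like(tp), where=col_sums > 0)
    return accuracy, precision, recall
```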
Algorithm | J48 | IB1 | SVM | RF | AdaBoostM1 | Bagging
---|---|---|---|---|---|---
Accuracy (%) | 88 | 91.2 | 89.4 | 90.3 | 92.1 | 89.8
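For readers who want to reproduce a comparison of this kind, the sketch below uses scikit-learn analogues of the listed classifiers (J48 ≈ a C4.5-style decision tree, IB1 ≈ 1-nearest-neighbour, AdaBoostM1 ≈ AdaBoost) with 10-fold cross-validation. The original experiments used other implementations, and the synthetic data here is only a stand-in for the real feature matrix and activity labels.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Stand-in for the real segment features (X) and activity labels (y).
X, y = make_classification(n_samples=600, n_features=87, n_informative=20,
                           n_classes=5, random_state=0)

classifiers = {
    "J48 (decision tree)": DecisionTreeClassifier(random_state=0),
    "IB1 (1-NN)": KNeighborsClassifier(n_neighbors=1),
    "SVM": SVC(),
    "RF": RandomForestClassifier(random_state=0),
    "AdaBoostM1 (AdaBoost)": AdaBoostClassifier(random_state=0),
    "Bagging": BaggingClassifier(random_state=0),
}

for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation
    print(f"{name}: {100 * scores.mean():.1f}% mean accuracy")
```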