Hardware for Recognition of Human Activities: A Review of Smart Home and AAL Related Technologies
Figure 1. Some terms selected in the network visualization of the bibliometric analysis generated with the VOSviewer software.
Figure 2. Largest bibliometric network visualization of the combined papers retrieved from the Web of Science (WoS) around the terms smart home, smart environment, activity recognition, and ambient assisted living.
Figure 3. Diagram of the review method, based on PRISMA and the operative structure in [28].
Figure 4. Trend in the number of publications among the initially selected papers for all years included in the database.
Figure 5. World map showing the geographic concentration and distribution of the selected works.
Figure 6. Journal distribution of the selected papers.
Figure 7. WoS research area distribution of the selected works.
Figure 8. Analysis of hardware technology distribution. (a) Percentage of use across the reviewed works. (b) Breakdown into self-developed and commercial device-based solutions.
Figure 9. Identified categories of hardware technology used for activity recognition in smart home and ambient assisted living, and their contributions.
Figure 10. Assistive robots identified in activity recognition research: (a) PR2 robot [59]; (b) Pepper robot [60]; (c) Care-O-bot 3 [61].
Figure 11. Distribution of activity recognition applications in smart home and AAL and their relationship with target populations.
Figure 12. Relationships between technology (square) and research focus (circle) for activity recognition.
Figure 13. Relationship network for hardware solutions deployed in activity recognition for smart home and AAL: technology (orange square), particular type of technology (green circle), itemized sensors (pink circle), and other specific devices (blue circle).
Abstract
1. Introduction
2. Review Method
2.1. Query String Construction
- FQ1: (AAL query) × (AR query)
- FQ2: (SH query) × (AR query)
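The two final queries cross the sub-queries from Table 1 with the activity recognition query. A minimal sketch of how they could be assembled programmatically is shown below; the `TS=` Web of Science topic field tag and the `combine` helper are illustrative assumptions, not the authors' actual tooling, and the sub-query strings are copied from Table 1.

```python
# Sub-queries as listed in Table 1 (copied verbatim).
AAL_QUERY = '(AAL OR "ambient assisted" OR assistance OR assistive)'
SH_QUERY = '(Smart AND (home OR environment OR house OR device) OR intelligence)'
AR_QUERY = ('(Activity OR recognition OR "human activity" OR action '
            'OR "human action" OR "event detection")')

def combine(left: str, right: str) -> str:
    """Cross two sub-queries (the 'x' operator) into one WoS topic search.

    The TS= field tag is an assumption about how the search was entered.
    """
    return f"TS=({left} AND {right})"

FQ1 = combine(AAL_QUERY, AR_QUERY)  # (AAL query) x (AR query)
FQ2 = combine(SH_QUERY, AR_QUERY)   # (SH query) x (AR query)

print(FQ1)
print(FQ2)
```

Running the sketch prints the two fully expanded boolean strings, making it easy to verify that every chosen term from Table 1 is present before pasting the queries into the database search form.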
2.2. Gathering Potential Results
2.3. Including and Excluding Results
- Proposed schemes and approaches evaluated only on simulated scenarios or on open, popular, or well-known datasets, without a proven experiment.
- Proposals of methodologies, approaches, frameworks or related that do not mention explicit testbeds, prototypes, or experimentation with hardware.
- Home automation applications, brain or gait activity recognition, health variables, or proposals to improve systems or cognitive activity.
- The paper used hardware to acquire information for AR in the same research work.
- The datasets used were generated in experiments conducted within the same research work.
- Commercial technology, self-built devices, or developed prototypes were used.
- Tested approaches with self-built datasets using virtual or physical sensors on smartphones, smartwatches, smart bands, etc.
- There was a focus on testing and using hardware, acquired, or self-developed as part of the research.
2.4. Characterization of the Selected Literature
3. Results
3.1. Wearables
3.2. Smartphones
3.3. Video
3.4. Electronic Components
3.5. Wi-Fi
3.6. Assistive Robotics
4. Analysis and Discussion
- Video technology can support mobility and localization applications, with wearables serving as the alerting channel.
- Due to the prominent Wi-Fi results, research should extend to occupancy detection, fall detection, and posture for care.
- Assistive robots with wearables, smartphones, and electronic components can be used for vital sign monitoring and alerts for remote care.
- Wearables can be used for occupancy applications and care of health conditions.
- How are large-scale house projects for activity recognition planned?
- Through technological surveillance, how can we extend our understanding of promising advances such as smart floors, smart beds, and smart walls?
- Which types of tested hardware technology are giving better results?
- How can researchers design testbeds? An overview of how to design this type of experiment is crucial to increase the credibility of new paper proposals submitted for approval by scientific networks.
- What is the cost-benefit relationship in achieving effectiveness in each focus of activity recognition?
- Which commercial technology gives the best effective results in activity recognition so that it can be taken to market?
5. Conclusions
Funding
Conflicts of Interest
References
- Bejarano, A. Towards the Evolution of Smart Home Environments: A Survey. Int. J. Autom. Smart Technol. 2016, 6, 105–136. [Google Scholar]
- Mantoro, T.; Ayu, M.A.; Elnour, E.E. Web-enabled smart home using wireless node infrastructure. In Proceedings of the MoMM ’11: 9th International Conference on Advances in Mobile Computing and Multimedia, Ho Chi Minh City, Vietnam, 5–7 December 2011; pp. 72–79. [Google Scholar] [CrossRef]
- Qu, T.; Bin, S.; Huang, G.Q.; Yang, H.D. Two-stage product platform development for mass customization. Int. J. Prod. Res. 2011, 49, 2197–2219. [Google Scholar] [CrossRef]
- Kim, C.G.; Kim, K.J. Implementation of a cost-effective home lighting control system on embedded Linux with OpenWrt. Pers. Ubiquitous Comput. 2014, 18, 535–542. [Google Scholar] [CrossRef]
- Chen, L.; Hoey, J.; Nugent, C.D.; Cook, D.J.; Yu, Z. Sensor-based activity recognition. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 790–808. [Google Scholar] [CrossRef]
- Kumari, P.; Mathew, L.; Syal, P. Increasing trend of wearables and multimodal interface for human activity monitoring: A review. Biosens. Bioelectron. 2017, 90, 298–307. [Google Scholar] [CrossRef] [PubMed]
- Younes, R.; Jones, M.; Martin, T.L. Classifier for activities with variations. Sensors 2018, 18, 3529. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Peng, L.; Chen, L.; Wu, X.; Guo, H.; Chen, G. Hierarchical Complex Activity Representation and Recognition Using Topic Model and Classifier Level Fusion. IEEE Trans. Biomed. Eng. 2017, 64, 1369–1379. [Google Scholar]
- Amiribesheli, M.; Benmansour, A.; Bouchachia, A. A review of smart homes in healthcare. J. Ambient Intell. Humaniz. Comput. 2015, 6, 495–517. [Google Scholar] [CrossRef] [Green Version]
- Bang, J.; Hur, T.; Kim, D.; Huynh-The, T.; Lee, J.; Han, Y.; Banos, O.; Kim, J.I.; Lee, S. Adaptive data boosting technique for robust personalized speech emotion in emotionally-imbalanced small-sample environments. Sensors (Switzerland) 2018, 18, 3744. [Google Scholar] [CrossRef] [Green Version]
- Weiser, M. The computer for the 21st century. Sci. Am. 1991, 265, 94–105. [Google Scholar] [CrossRef]
- Malkani, Y.A.; Memon, W.A.; Dhomeja, L.D. A Low-cost Activity Recognition System for Smart Homes. In Proceedings of the 2018 IEEE 5th International Conference on Engineering Technologies and Applied Sciences (ICETAS), Bangkok, Thailand, 22–23 November 2018; pp. 1–7. [Google Scholar]
- Acampora, G.; Cook, D.J.; Rashidi, P.; Vasilakos, A.V. A Survey on Ambient Intelligence in Health Care. Proc. IEEE 2013, 101, 2470–2494. [Google Scholar]
- De-La-Hoz-Franco, E.; Ariza-Colpas, P.; Quero, J.M.; Espinilla, M. Sensor-based datasets for human activity recognition—A systematic review of literature. IEEE Access 2018, 6, 59192–59210. [Google Scholar] [CrossRef]
- Espinilla, M.; Medina, J.; Calzada, A.; Liu, J.; Martínez, L.; Nugent, C. Optimizing the configuration of an heterogeneous architecture of sensors for activity recognition, using the extended belief rule-based inference methodology. Microprocess. Microsyst. 2017, 52, 381–390. [Google Scholar] [CrossRef] [Green Version]
- Hassan, M.M.; Uddin, M.Z.; Mohamed, A.; Almogren, A. A robust human activity recognition system using smartphone sensors and deep learning. Future Gener. Comput. Syst. 2018, 81, 307–313. [Google Scholar]
- Kötteritzsch, A.; Weyers, B. Assistive Technologies for Older Adults in Urban Areas: A Literature Review. Cognit. Comput. 2016, 8, 299–317. [Google Scholar] [CrossRef]
- Ni, Q.; Hernando, A.B.G.; de la Cruz, I.P. The Elderly’s Independent Living in Smart Homes: A Characterization of Activities and Sensing Infrastructure Survey to Facilitate Services Development. Sensors 2015, 15, 11312–11362. [Google Scholar] [CrossRef]
- Peetoom, K.K.B.; Lexis, M.A.S.; Joore, M.; Dirksen, C.D.; De Witte, L.P. Literature review on monitoring technologies and their outcomes in independently living elderly people. Disabil. Rehabil. Assist. Technol. 2015, 10, 271–294. [Google Scholar] [CrossRef]
- Johansson, F. Medici Effect: What Elephants and Epidemics can Teach Us about Innovation; Harvard Business School Press: Boston, MA, USA, 2020; ISBN 978163362947. [Google Scholar]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. Ann. Intern. Med. 2009, 151, 264–269. [Google Scholar] [CrossRef] [Green Version]
- Sanchez, A.; Neira, D.; Cabello, J. Frameworks applied in Quality Management—A Systematic Review. Rev. Espac. 2016, 37, 17. [Google Scholar]
- Van Eck, N.J.; Waltman, L. Visualizing Bibliometric Networks. In Measuring Scholarly Impact; Springer: Cham, Switzerland, 2014; ISBN 9783319103778. [Google Scholar]
- Wang, Q.; Waltman, L. Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus. J. Informetr. 2016, 10, 347–364. [Google Scholar] [CrossRef] [Green Version]
- Franceschini, F.; Maisano, D.; Mastrogiacomo, L. Customer requirement prioritization on QFD: A new proposal based on the generalized Yager’s algorithm. Res. Eng. Des. 2015, 26, 171–187. [Google Scholar] [CrossRef]
- AlRyalat, S.A.S.; Malkawi, L.W.; Momani, S.M. Comparing Bibliometric Analysis Using PubMed, Scopus, and Web of Science Databases. J. Vis. Exp. 2019, 152, e58494. [Google Scholar] [CrossRef] [PubMed]
- Perianes-Rodriguez, A.; Waltman, L.; van Eck, N.J. Constructing bibliometric networks: A comparison between full and fractional counting. J. Informetr. 2016, 10, 1178–1195. [Google Scholar] [CrossRef] [Green Version]
- Sanchez-Comas, A.; Neira, D.; Cabello, J.J. Marcos aplicados a la Gestión de Calidad—Una Revisión Sistemática de la Literatura. Espacios 2016, 37, 17. [Google Scholar]
- Ota, H.; Chao, M.; Gao, Y.; Wu, E.; Tai, L.C.; Chen, K.; Matsuoka, Y.; Iwai, K.; Fahad, H.M.; Gao, W.; et al. 3D printed “earable” smart devices for real-time detection of core body temperature. ACS Sens. 2017, 2, 990–997. [Google Scholar] [CrossRef] [PubMed]
- Mendoza-Palechor, F.; Menezes, M.L.; Sant’Anna, A.; Ortiz-Barrios, M.; Samara, A.; Galway, L. Affective recognition from EEG signals: An integrated data-mining approach. J. Ambient Intell. Humaniz. Comput. 2019, 10, 3955–3974. [Google Scholar] [CrossRef]
- Bilbao, A.; Almeida, A.; López-de-ipiña, D. Promotion of active ageing combining sensor and social network data. J. Biomed. Inform. 2016, 64, 108–115. [Google Scholar] [CrossRef]
- Lee, J.S.; Choi, S.; Kwon, O. Identifying multiuser activity with overlapping acoustic data for mobile decision making in smart home environments. Expert Syst. Appl. 2017, 81, 299–308. [Google Scholar] [CrossRef]
- Damian, I.; Dietz, M.; Steinert, A.; André, E.; Haesner, M.; Schork, D. Automatic Detection of Visual Search for the Elderly using Eye and Head Tracking Data. KI Künstl. Intell. 2017, 31, 339–348. [Google Scholar]
- Zhang, S.; Mccullagh, P. Situation Awareness Inferred From Posture Transition and Location. IEEE Trans. Hum. Mach. Syst. 2017, 47, 814–821. [Google Scholar] [CrossRef]
- Rafferty, J.; Nugent, C.D.; Liu, J. From Activity Recognition to Intention Recognition for Assisted Living Within Smart Homes. IEEE Trans. Hum. Mach. Syst. 2017, 47, 368–379. [Google Scholar] [CrossRef] [Green Version]
- Amft, O.; Van Laerhoven, K. Wearable Computing. IEEE Pervasive Comput. 2017, 19, 80–85. [Google Scholar] [CrossRef]
- Athavale, Y.; Krishnan, S. Biosignal monitoring using wearables: Observations and opportunities. Biomed. Signal. Process. Control. 2017, 38, 22–33. [Google Scholar] [CrossRef]
- Augustyniak, P.; Ślusarczyk, G. Graph-based representation of behavior in detection and prediction of daily living activities. Comput. Biol. Med. 2018, 95, 261–270. [Google Scholar] [CrossRef]
- Ni, Q.; Zhang, L.; Li, L. A Heterogeneous Ensemble Approach for Activity Recognition with Integration of Change Point-Based Data Segmentation. Appl. Sci. 2018, 8, 1695. [Google Scholar] [CrossRef] [Green Version]
- Ahmed, M.; Mehmood, N.; Nadeem, A.; Mehmood, A.; Rizwan, K. Fall Detection System for the Elderly Based on the Classification of Shimmer Sensor Prototype Data. Healthc. Inform. Res. 2017, 23, 147–158. [Google Scholar] [CrossRef] [Green Version]
- Clapés, A.; Pardo, À.; Pujol Vila, O.; Escalera, S. Action detection fusing multiple Kinects and a WIMU: An application to in-home assistive technology for the elderly. Mach. Vis. Appl. 2018, 29, 765–788. [Google Scholar] [CrossRef]
- Faye, S.; Bronzi, W.; Tahirou, I.; Engel, T. Characterizing user mobility using mobile sensing systems. Int. J. Distrib. Sens. Netw. 2017, 13, 1550147717726310. [Google Scholar] [CrossRef] [Green Version]
- Garcia-Ceja, E.; Galván-Tejada, C.E.; Brena, R. Multi-view stacking for activity recognition with sound and accelerometer data. Inf. Fusion 2018, 40, 45–56. [Google Scholar] [CrossRef]
- Kang, J.; Larkin, H. Application of an Emergency Alarm System for Physiological Sensors Utilizing Smart Devices. Technologies 2017, 5, 26. [Google Scholar] [CrossRef] [Green Version]
- Maglogiannis, I. Fall detection and activity identification using wearable and hand-held devices. Integr. Comput. Aided Eng. 2016, 23, 161–172. [Google Scholar] [CrossRef]
- Shewell, C.; Nugent, C.; Donnelly, M.; Wang, H.; Espinilla, M. Indoor localization through object detection within multiple environments utilizing a single wearable camera. Health Technol. 2017, 7, 51–60. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hardegger, M.; Calatroni, A.; Tröster, G.; Roggen, D. S-SMART: A Unified Bayesian Framework for Simultaneous Semantic Mapping, Activity Recognition, and Tracking. ACM Trans. Intell. Syst. Technol. 2016, 7, 1–28. [Google Scholar] [CrossRef]
- Ma, C.; Shimada, A.; Uchiyama, H.; Nagahara, H.; Taniguchi, R.I. Fall detection using optical level anonymous image sensing system. Opt. Laser Technol. 2019, 110, 44–61. [Google Scholar] [CrossRef]
- Withanage, K.I.; Lee, I.; Brinkworth, R.; Mackintosh, S.; Thewlis, D. Fall Recovery Subactivity Recognition with RGB-D Cameras. IEEE Trans. Ind. Inform. 2016, 12, 2312–2320. [Google Scholar] [CrossRef]
- Akula, A.; Shah, A.K.; Ghosh, R. Deep learning approach for human action recognition in infrared images. Cogn. Syst. Res. 2018, 50, 146–154. [Google Scholar] [CrossRef]
- Cho, Y.; Julier, S.J.; Marquardt, N.; Bianchi-Berthouze, N. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomed. Opt. Express 2017, 8, 1565–1588. [Google Scholar]
- Fernández-Caballero, A.; Martínez-Rodrigo, A.; Pastor, J.M.; Castillo, J.C.; Lozano-Monasor, E.; López, M.T.; Zangróniz, R.; Latorre, J.M.; Fernández-Sotos, A. Smart environment architecture for emotion detection and regulation. J. Biomed. Inform. 2016, 64, 55–73. [Google Scholar] [CrossRef]
- Xu, Q.; Safar, Z.; Han, Y.; Wang, B.; Liu, K.J.R. Statistical Learning Over Time-Reversal Space for Indoor Monitoring System. IEEE Internet Things J. 2018, 5, 970–983. [Google Scholar] [CrossRef]
- Guo, L.; Wang, L.; Liu, J.; Zhou, W.; Lu, B. HuAc: Human Activity Recognition Using Crowdsourced WiFi Signals and Skeleton Data. Wirel. Commun. Mob. Comput. 2018, 2018, 6163475. [Google Scholar] [CrossRef]
- Savazzi, S.; Rampa, V. Leveraging MIMO-OFDM radio signals for device-free occupancy inference: System design and experiments. EURASIP J. Adv. Signal Process. 2018, 44, 1–19. [Google Scholar]
- Koppula, H.S.; Saxena, A. Anticipating Human Activities Using Object Affordances for Reactive Robotic Response. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 38, 14–29. [Google Scholar]
- Saunders, J.; Syrdal, D.S.; Koay, K.L.; Burke, N.; Dautenhahn, K. ’Teach Me-Show Me’-End-User Personalization of a Smart Home and Companion Robot. IEEE Trans. Hum. Mach. Syst. 2016, 46, 27–40. [Google Scholar] [CrossRef] [Green Version]
- Costa, A.; Martinez-Martin, E.; Cazorla, M.; Julian, V. PHAROS—PHysical assistant RObot system. Sensors (Switzerland) 2018, 18, 2633. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- File: PR2 Robot with Advanced Grasping hands.JPG. Available online: https://commons.wikimedia.org/w/index.php?title=File:PR2_robot_with_advanced_grasping_hands.JPG (accessed on 17 December 2019).
- File: Pepper—France—Les Quatres Temps—Darty—2016-11-04.jpg. Available online: https://commons.wikimedia.org/w/index.php?title=File:Pepper_-_France_-_Les_Quatres_Temps_-_Darty_-_2016-11-04.jpg (accessed on 17 December 2019).
- File: Care-O-Bot Grasping an Object on the Table (5117071459).jpg. Available online: https://commons.wikimedia.org/w/index.php?title=File:Care-O-Bot_grasping_an_object_on_the_table_(5117071459).jpg (accessed on 17 December 2019).
- Villeneuve, E.; Harwin, W.; Holderbaum, W.; Janko, B.; Sherratt, R.S. Reconstruction of Angular Kinematics From Wrist-Worn Inertial Sensor Data for Smart Home Healthcare. IEEE Access 2017, 5, 2351–2363. [Google Scholar] [CrossRef]
- Zhang, Z.; Song, Y.; Cui, L.; Liu, X. Emotion recognition based on customized smart bracelet with built-in accelerometer. PeerJ 2016, 4, e2258. [Google Scholar] [CrossRef] [Green Version]
- Activity, H.; Using, R. Hierarchical Activity Recognition Using Smart Watches and RGB-Depth Cameras. Sensors 2016, 16, 1713. [Google Scholar]
- Biagetti, G.; Crippa, P.; Falaschetti, L.; Turchetti, C. Classifier level fusion of accelerometer and sEMG signals for automatic fitness activity diarization. Sensors (Switzerland) 2018, 18, 2850. [Google Scholar] [CrossRef] [Green Version]
- Orcioni, S.; Turchetti, C.; Falaschetti, L.; Crippa, P.; Biagetti, G. Human activity monitoring system based on wearable sEMG and accelerometer wireless sensor nodes. Biomed. Eng. Online 2018, 17, 132. [Google Scholar]
- Wang, P.; Sun, L.; Yang, S.; Smeaton, A.F.; Gurrin, C. Characterizing everyday activities from visual lifelogs based on enhancing concept representation. Comput. Vis. Image Underst. 2016, 148, 181–192. [Google Scholar] [CrossRef] [Green Version]
- Mokhtari, G.; Zhang, Q.; Nourbakhsh, G.; Ball, S.; Karunanithi, M. BLUESOUND: A New Resident Identification Sensor—Using Ultrasound Array and BLE Technology for Smart Home Platform. IEEE Sens. J. 2017, 17, 1503–1512. [Google Scholar] [CrossRef]
- Chen, Z. Robust Human Activity Recognition Using Smartphone Sensors via CT-PCA. IEEE Trans. Ind. Inform. 2017, 13, 3070–3080. [Google Scholar] [CrossRef]
- Khan, M.A.A.H.; Roy, N.; Hossain, H.M.S. Wearable Sensor-Based Location-Specific Occupancy Detection in Smart Environments. Mob. Inf. Syst. 2018, 2018, 4570182. [Google Scholar] [CrossRef] [Green Version]
- Iwasawa, Y.; Eguchi Yairi, I.; Matsuo, Y. Combining human action sensing of wheelchair users and machine learning for autonomous accessibility data collection. IEICE Trans. Inf. Syst. 2016, E99D, 1153–1161. [Google Scholar] [CrossRef] [Green Version]
- Gupta, H.P.; Chudgar, H.S.; Mukherjee, S.; Dutta, T.; Sharma, K. A Continuous Hand Gestures Recognition Technique for Human-Machine Interaction Using Accelerometer and gyroscope sensors. IEEE Sens. J. 2016, 16, 6425–6432. [Google Scholar] [CrossRef]
- Saha, J.; Chowdhury, C.; Biswas, S. Two phase ensemble classifier for smartphone based human activity recognition independent of hardware configuration and usage behaviour. Microsyst. Technol. 2018, 24, 2737–2752. [Google Scholar] [CrossRef]
- Liu, Z.; Yin, J.; Li, J.; Wei, J.; Feng, Z. A new action recognition method by distinguishing ambiguous postures. Int. J. Adv. Robot. Syst. 2018, 15, 1729881417749482. [Google Scholar] [CrossRef] [Green Version]
- Yao, B.; Hagras, H.; Alghazzawi, D.; Alhaddad, M.J. A Big Bang—Big Crunch Type-2 Fuzzy Logic System for Machine-Vision-Based Event Detection and Summarization in Real-World Ambient-Assisted Living. IEEE Trans. Fuzzy Syst. 2016, 24, 1307–1319. [Google Scholar] [CrossRef] [Green Version]
- Trindade, P.; Langensiepen, C.; Lee, K.; Adama, D.A.; Lotfi, A. Human activity learning for assistive robotics using a classifier ensemble. Soft Comput. 2018, 22, 7027–7039. [Google Scholar]
- Wang, S.; Chen, L.; Zhou, Z.; Sun, X.; Dong, J. Human fall detection in surveillance video based on PCANet. Multimed. Tools Appl. 2016, 75, 11603–11613. [Google Scholar] [CrossRef]
- Eldib, M.; Deboeverie, F.; Philips, W.; Aghajan, H. Behavior analysis for elderly care using a network of low-resolution visual sensors. J. Electron. Imaging 2016, 25, 041003. [Google Scholar] [CrossRef] [Green Version]
- Wickramasinghe, A.; Shinmoto Torres, R.L.; Ranasinghe, D.C. Recognition of falls using dense sensing in an ambient assisted living environment. Pervasive Mob. Comput. 2017, 34, 14–24. [Google Scholar] [CrossRef]
- Chen, Z.; Wang, Y. Infrared–ultrasonic sensor fusion for support vector machine–based fall detection. J. Intell. Mater. Syst. Struct. 2018, 29, 2027–2039. [Google Scholar] [CrossRef]
- Chen, Z.; Wang, Y.; Liu, H. Unobtrusive Sensor based Occupancy Facing Direction Detection and Tracking using Advanced Machine Learning Algorithms. IEEE Sens. J. 2018, 18, 6360–6368. [Google Scholar] [CrossRef]
- Wang, J.; Zhang, X.; Gao, Q.; Feng, X.; Wang, H. Device-Free Simultaneous Wireless Localization and Activity Recognition With Wavelet Feature. IEEE Trans. Veh. Technol. 2017, 66, 1659–1669. [Google Scholar] [CrossRef]
- Rus, S.; Grosse-Puppendahl, T.; Kuijper, A. Evaluating the recognition of bed postures using mutual capacitance sensing. J. Ambient Intell. Smart Environ. 2017, 9, 113–127. [Google Scholar] [CrossRef]
- Cheng, A.L.; Georgoulas, C.; Bock, T. Automation in Construction Fall Detection and Intervention based on Wireless Sensor Network Technologies. Autom. Constr. 2016, 71, 116–136. [Google Scholar] [CrossRef]
- Hossain, H.M.S.; Khan, M.A.A.H.; Roy, N. Active learning enabled activity recognition. Pervasive Mob. Comput. 2017, 38, 312–330. [Google Scholar] [CrossRef]
- Shah, S.A.; Ren, A.; Fan, D.; Zhang, Z.; Zhao, N.; Yang, X. Internet of Things for Sensing: A Case Study in the Healthcare System. Appl. Sci. 2018, 8, 508. [Google Scholar]
- Jiang, J.; Pozza, R.; Gunnarsdóttir, K.; Gilbert, N.; Moessner, K. Using Sensors to Study Home Activities. J. Sens. Actuator Netw. 2017, 6, 32. [Google Scholar] [CrossRef] [Green Version]
- Luo, X.; Guan, Q.; Tan, H.; Gao, L.; Wang, Z.; Luo, X. Simultaneous Indoor Tracking and Activity Recognition Using Pyroelectric Infrared Sensors. Sensors 2017, 17, 1738. [Google Scholar] [CrossRef] [PubMed]
- Gill, S.; Seth, N.; Scheme, E. A multi-sensor matched filter approach to robust segmentation of assisted gait. Sensors (Switzerland) 2018, 18, 2970. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Sasakawa, D. Human Posture Identification Using a MIMO Array. Electronics 2018, 7, 37. [Google Scholar] [CrossRef] [Green Version]
- Suyama, T. A network-type brain machine interface to support activities of daily living. IEICE Trans. Commun. 2016, E99B, 1930–1937. [Google Scholar] [CrossRef] [Green Version]
- Li, W.; Tan, B.; Piechocki, R. Passive Radar for Opportunistic Monitoring in E-Health Applications. IEEE J. Transl. Eng. Health Med. 2018, 6, 1–10. [Google Scholar] [CrossRef] [PubMed]
| Interest Area | Common Term from VOSviewer | Duplication Frequency | Chosen Terms | Primary Query Strings |
|---|---|---|---|---|
| Ambient assisted living (AAL) | AAL | 11 | AAL; "ambient assisted"; assistance; assistive | AAL query: AAL OR "ambient assisted" OR assistance OR assistive |
| | Ambient assisted | 11 | | |
| | Assisted | 4 | | |
| | Ambient | 4 | | |
| | Ambient assisted living | 3 | | |
| | Assisted technology | 2 | | |
| | AAL platform | 1 | | |
| | AAL service | 1 | | |
| | AAL system | 1 | | |
| Smart home (SH) | Smart home | 9 | Smart; home; environment; device; house | SH query: Smart AND (home OR environment OR house OR device) OR intelligence |
| | Smart home technology | 6 | | |
| | Smart home system | 5 | | |
| | Smart home device | 3 | | |
| | Smart house | 1 | | |
| | Smart device | 1 | | |
| Smart environment (SE) | Smart environment | 6 | Smart; environment; intelligence; home | |
| | Home environment | 3 | | |
| | Intelligent environment | 2 | | |
| | Smart environment | 1 | | |
| | Intelligence | 1 | | |
| Activity recognition (AR) | Activity | 18 | Activity; recognition; "human activity"; "human action"; "event detection"; action | AR query: Activity OR recognition OR "human activity" OR action OR "human action" OR "event detection" |
| | Recognition | 14 | | |
| | Human activity | 7 | | |
| | Human activity recognition | 4 | | |
| | Activity recognition system | 3 | | |
| | Action recognition | 2 | | |
| | Human action recognition | 2 | | |
| | Recognition system | 1 | | |
| | Human action | 1 | | |
Wearable Technology Used | Context of the Proposal | AR Solution

| Model | Type | Sensor | Body Part | Combination | Applications | Target | Commercial | Developed | Ref. |
|---|---|---|---|---|---|---|---|---|---|
| Customized | Wearable sensor band | Accelerometer + heart rate sensor | Chest + limb | Video | Generic AR applications | All | | X | [38] |
| Customized | | Accelerometer + Gyroscope + Magnetometer | Arms | Smartphone | Generic AR applications + Localization | Elderly | | X | [47] |
| Customized | | Accelerometer + Gyroscope | Arm | - | Generic AR applications | Health | | X | [62] |
| Customized | | Accelerometer | Hand | - | Emotion recognition | Health | | X | [63] |
| Customized | Skin sensor | Electro-dermal activity (EDA) | Skin | Video | Emotion recognition | Health | | X | [52] |
| Google Glass Explorer | SmartGlass | Video capture | Head | - | Localization | Elderly | X | | [46] |
| Google Glass-based + Head tracking device + Empatica E3 sensor armband | SmartGlass + smart band | IMU + Audio + Video | Head + Arm | Electronic components + Smartphone + Video | Generic AR applications | Elderly | X | X | [33] |
| Microsoft Band 2 | Smartwatch | Accelerometer | Arms | Smartphone | Generic AR applications | All | X | | [43] |
| Fitbit + Intel Basis Peak | | Heart rate monitoring + Skin temperature monitoring | Hand | Smartphone | Posture recognition | All | X | | [44] |
| HiCling | | Optical sensor + Accelerometer + Captive skin touch sensor | Arms | Electronic Components + Smartphone | Fall detection | All | X | | [34] |
| NS | | Accelerometer + Gyroscope | Arms | Video | Generic AR applications | All | X | | [64] |
| Pebble SmartWatch | | 3-axis integer accelerometer | Arms | Smartphone | Fall detection | Elderly | X | | [45] |
| Samsung Galaxy Gear Live | | Accelerometer + Heart rate sensor | Arms | Smartphone | Mobility | All | X | | [42] |
| Shimmer | Wearable sensor band | Accelerometer | Wrist | - | Generic AR applications | Elderly | X | | [39] |
| Shimmer | | Accelerometer + Gyroscope | Abs | - | Fall detection | Elderly | X | | [40,52] |
| Shimmer | | Accelerometer + Gyroscope | Wrist | Video | Generic AR applications | Elderly | X | | [41] |
| WiSE | | Accelerometer | Arms | - | Generic AR applications | Sport | | X | [65] |
| WiSE | | Electrodes + Accelerometer | Arms | - | Generic AR applications | Sport | | X | [66] |
| Microsoft SenseCam | Wearable camera | Video capture | Chest | - | Generic AR applications | All | X | | [67] |
Smartphone Uses | Context of the Proposal | AR Solution

| Model | Sensor Applied | Combination | Applications | Target | Commercial | Developed | Ref. |
|---|---|---|---|---|---|---|---|
| Smartphone | Data transmission | - | Generic AR applications | Elderly | X | | [35] |
| Smartphone | Data transmission + App | Wearable | Generic AR applications + Localization | Elderly | X | | [47] |
| Android | App | Video | Health conditions | All | X | | [51] |
| Android | Accelerometer | Wearable | Posture recognition | All | X | | [44] |
| Android | Mic | Wearable | Generic AR applications | All | X | | [43] |
| Android | Data transmission + App | Electronic Components | Occupancy | All | X | | [68] |
| Android | Data transmission + App | Wearable | Generic AR applications | Elderly | X | | [45] |
| Google NEXUS 4 | Accelerometer | - | Generic AR applications | Health | X | | [69] |
| Google NEXUS 5 | Accelerometer + Mic + Magnetometer | - | Occupancy | All | X | | [70] |
| HTC802w | Accelerometer + GPS | Electronic Components + Wearable | Fall detection | All | X | | [34] |
| iPod Touch | Accelerometer | - | Mobility | Disabled | X | | [71] |
| LG Nexus 5 | Accelerometer + Mic + GPS + Wi-Fi | Wearable | Mobility | All | X | | [42] |
| Samsung ATIV | Accelerometer + Gyroscope | - | Posture recognition | All | X | | [72] |
| Samsung Galaxy S4 | Accelerometer + Mic | Electronic Components + Wearable + Video | Generic AR applications | Elderly | X | X | [33] |
| Xolo Era 2X and Samsung GT57562 | Accelerometer | - | Generic AR applications | All | X | | [73] |
Video Technology | Context of the Proposal | AR Solution

| Type | Model | Combination | Applications | Target | Commercial | Developed | Ref. |
|---|---|---|---|---|---|---|---|
| RGB-D sensor | Kinect | Assistive robotics | Care | All | X | | [56] |
| | Kinect | Wi-Fi | Generic AR applications | All | X | | [54] |
| | Kinect | Wearable | Generic AR applications | All | X | | [64] |
| | Kinect | - | Posture recognition | All | X | | [74] |
| | Kinect | - | Care | Disabled + Elderly | X | | [75] |
| | Kinect | - | Generic AR applications | All | X | | [76] |
| | Kinect | Wearable | Generic AR applications | Elderly | X | | [41] |
| RGB-D sensor + Vicon system camera | Kinect + Vicon system camera | - | Fall detection | Elderly | X | | [49] |
| RGB-D sensor + Thermal camera | Thermal camera PI450 + FLIR Grasshopper RGB GS3-U3-28S5C-C | - | Fall detection | Elderly | X | | [48] |
| Thermal camera | FLIR One for Android | Smartphone | Health conditions | All | X | | [51] |
| | FLIR E60 thermal infrared camera | - | Care | Elderly | X | | [50] |
| Video camera | - | - | Fall detection | Elderly | X | | [77] |
| | - | Wearable | Emotion recognition | Health | X | | [52] |
| Video camera + Infrared camera | - | Wearable | Generic AR applications | All | X | | [38] |
| Optical sensor | Agilent ADNS-3060 optical mouse sensors | - | Care | Elderly | X | | [78] |
Electronic Components Used | Context of the Proposal | AR Solution

| Technologies | Reference/Model | Combination | Applications | Target | Commercial | Developed | Ref. |
|---|---|---|---|---|---|---|---|
| TAG RFID + RFID antennas + RFID reader | Smartrack FROG 3D RFID + RFID reader antennas + Impinj Speedway R-420 RFID reader | - | Fall detection | Elderly | | X | [79] |
| Active tags | - | Smartphone + Wearable | Fall detection | All | | X | [34] |
| Grid-EYE + Ultrasonic sensor + Arduino | Grid-EYE (AMG8853, Panasonic Inc.) hotspot detection + Ultrasonic HC-SR04 + Arduino Mega | - | Fall detection | Elderly | | X | [80] |
| Grid-EYE + Rotational platform + Time-of-flight (ToF) ranging sensor + Arduino | Grid-EYE AMG8853 Panasonic + VL53L0X + Arduino Nano | - | Localization + Occupancy | All | | X | [81] |
| HC-SR04 + PIR module + BLE module | - | - | Occupancy | All | | X | [68] |
| Infrared camera + Raspberry Pi | Pupil Labs eye tracker + Raspberry Pi 2 | Smartphone + Wearable | Generic AR applications | Elderly | X | X | [33] |
| Microphone | - | - | Generic AR applications | All | | X | [32] |
| Zigbee transceiver + ultra-low-power microcontroller | CC2520 + MSP430F5438 chipsets | - | Localization | All | | X | [82] |
| Capacitive sensing | OpenCapSense sensing toolkit | - | Posture recognition | Health | | X | [83] |
| XBee Pro + Series Pro 2B antennas + Laser diode | Part 2 XBee Pro + Series Pro 2B antennas + NR | - | Fall detection | Elderly | | X | [84] |
| PIR sensors | - | - | Fall detection | Elderly | | X | [31] |
| PIR sensor + Motion sensor + Data sharing device | NR sensor + PogoPlug | - | Generic AR applications | All | | X | [85] |
| S-band antenna + Omnidirectional | - | - | Generic AR applications | Health | | X | [86] |
| Seeeduino + Temperature and humidity sensor + Light sensor + Ranging sensor + Microphone | Seeeduino Arch-Pro + HTU21D + Avago ADPS-9960 + GP2Y0A60SZ + Breakout board INMP401 | - | Generic AR applications | All | | X | [87] |
| Sensor node of nine PIR sensors arranged in a grid + CC2530 Zigbee module | CC2530 used to sample PIR signals and communicate with the sink node | - | Localization | Elderly | | X | [88] |
| Strain gauge sensor + IMU sensor | SGT-1A/1000-TY13 strain gauges + LSM9DS1 9-axis IMU | - | Health conditions | Elderly | | X | [89] |
| Measurement setup: low-noise amplifier (LNA) + data-acquisition unit (DAQ) + Switching SP64T + Downconverter unit + Patch antennas | - | - | Localization | Elderly | | X | [90] |
| Portable brain-activity measuring equipment (NIRS-EEG probes and unit) + Thermometer + Laser range finder + Kinect + Pyroelectric sensor + Wireless LAN system + Sensor arrangement cameras + Microphones + Infrared devices | - | - | Mobility | Disabled | | X | [91] |
| Tunable RF transceivers NI USRP-2920 + MIMO cable + Wireless energy transmitter + PCB antennas | - | - | Generic AR applications | Health | | X | [92] |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sanchez-Comas, A.; Synnes, K.; Hallberg, J. Hardware for Recognition of Human Activities: A Review of Smart Home and AAL Related Technologies. Sensors 2020, 20, 4227. https://doi.org/10.3390/s20154227