Ambient Sensors for Elderly Care and Independent Living: A Survey
Figure 1. Percentages of persons of different ages in the world in different years [1].
Figure 2. Numbers of persons (millions) aged over 60 years in the world in different years [1].
Figure 3. Numbers of persons (millions) aged over 80 years in the world in different years [1].
Figure 4. Sample schematic setup of a smart apartment for elderly care based on different ambient sensors.
Figure 5. Different domains of ambient sensor-based elder care systems.
Figure 6. Sample schematic setup for elderly care based on different ambient sensors and a mobile robot.
Abstract
1. Introduction
1.1. Surveys on Ambient Assisted Living
1.2. Ambient Assisted Living Projects
1.3. Privacy and Sensitive Data Protection
1.4. Article Searching Method
1.5. Contribution and Organization of the Paper
2. Ambient Sensors in Elderly Care
2.1. Passive Infrared (PIR) Motion Sensors
2.2. Video Sensors
2.3. Pressure Sensors
2.4. Sound Sensors
2.5. Floor Sensors
2.6. Radar Sensors
2.7. Combined Ambient Sensors
2.8. Combined Ambient and Wearable Sensors
2.9. Ambient Sensors in Mobile Robotic Systems
3. Future Direction and Vision
4. Conclusions
Funding
Conflicts of Interest
References
- United Nations. World Population Ageing; United Nations: New York, NY, USA, 2015. [Google Scholar]
- Duque, G. Age-Related Physical and Physiologic Changes and Comorbidities in Older People: Association with Falls. In Medication-Related Falls in Older People; Huang, A.R., Mallet, L., Eds.; Springer: Berlin, Germany, 2016; Chapter 6; pp. 67–73. [Google Scholar]
- Frieson, C.W. Predictors of Recurrent Falls in Community-Dwelling Older Adults after Fall-Related Hip Fracture. J. Perioper. Crit. Intensive Care Nurs. 2016, 2, e107. [Google Scholar]
- Singh, M.A.F. Exercise, nutrition and managing hip fracture in older persons. Curr. Opin. Clin. Nutr. Metab. Care 2013, 17, 12–24. [Google Scholar]
- Alwan, M.; Dalal, S.; Mack, D.; Kell, S.; Turner, B.; Leachtenauer, J.; Felder, R. Impact of Monitoring Technology in Assisted Living: Outcome Pilot. IEEE Trans. Inf. Technol. Biomed. 2006, 10, 192–198. [Google Scholar] [CrossRef] [PubMed]
- Scanaill, C.N.; Carew, S.; Barralon, P.; Noury, N.; Lyons, D.; Lyons, G.M. A Review of Approaches to Mobility Telemonitoring of the Elderly in Their Living Environment. Ann. Biomed. Eng. 2006, 34, 547–563. [Google Scholar] [CrossRef] [PubMed]
- Perry, M.; Dowdall, A.; Lines, L.; Hone, K. Multimodal and ubiquitous computing systems: Supporting independent-living older users. IEEE Trans. Inf. Technol. Biomed. 2004, 8, 258–270. [Google Scholar] [CrossRef] [PubMed]
- Al-Shaqi, R.; Mourshed, M.; Rezgui, Y. Progress in ambient assisted systems for independent living by the elderly. SpringerPlus 2016, 5. [Google Scholar] [CrossRef] [PubMed]
- Ni, Q.; Hernando, A.G.; de la Cruz, I. The Elderly’s Independent Living in Smart Homes: A Characterization of Activities and Sensing Infrastructure Survey to Facilitate Services Development. Sensors 2015, 15, 11312–11362. [Google Scholar] [CrossRef] [PubMed]
- Alam, M.R.; Reaz, M.B.I.; Ali, M.A.M. A Review of Smart Homes—Past, Present, and Future. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 1190–1203. [Google Scholar] [CrossRef]
- Rashidi, P.; Mihailidis, A. A Survey on Ambient-Assisted Living Tools for Older Adults. IEEE J. Biomed. Health Inf. 2013, 17, 579–590. [Google Scholar] [CrossRef]
- Salih, A.S.M.; Abraham, A. A review of ambient intelligence assisted healthcare monitoring. Int. J. Comput. Inf. Syst. Ind. Manag. Appl. 2013, 5, 741–750. [Google Scholar]
- Peetoom, K.K.B.; Lexis, M.A.S.; Joore, M.; Dirksen, C.D.; de Witte, L.P. Literature review on monitoring technologies and their outcomes in independently living elderly people. Disabil. Rehabil. Assist. Technol. 2014, 10, 271–294. [Google Scholar] [CrossRef] [PubMed]
- Khusainov, R.; Azzi, D.; Achumba, I.; Bersch, S. Real-Time Human Ambulation, Activity, and Physiological Monitoring: Taxonomy of Issues, Techniques, Applications, Challenges and Limitations. Sensors 2013, 13, 12852–12902. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Avci, A.; Bosch, S.; Marin-Perianu, M.; Marin-Perianu, R.; Havinga, P. Activity Recognition Using Inertial Sensing for Healthcare, Wellbeing and Sports Applications: A Survey. In Proceedings of the 23rd International Conference on Architecture of Computing Systems, Hannover, Germany, 22–23 February 2010; pp. 1–10. [Google Scholar]
- Bulling, A.; Blanke, U.; Schiele, B. A tutorial on human activity recognition using body-worn inertial sensors. ACM Comput. Surv. 2014, 46, 1–33. [Google Scholar] [CrossRef]
- Pantelopoulos, A.; Bourbakis, N.G. A Survey on Wearable Sensor-Based Systems for Health Monitoring and Prognosis. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2010, 40, 1–12. [Google Scholar] [CrossRef] [Green Version]
- Acampora, G.; Cook, D.J.; Rashidi, P.; Vasilakos, A.V. A Survey on Ambient Intelligence in Healthcare. Proc. IEEE 2013, 101, 2470–2494. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Helal, S.; Mann, W.; El-Zabadani, H.; King, J.; Kaddoura, Y.; Jansen, E. The Gator Tech Smart House: A programmable pervasive space. Computer 2005, 38, 50–60. [Google Scholar] [CrossRef]
- Cook, D.J.; Crandall, A.S.; Thomas, B.L.; Krishnan, N.C. CASAS: A Smart Home in a Box. Computer 2013, 46, 62–69. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Vacher, M.; Lecouteux, B.; Istrate, D.; Joubert, T.; Portet, F.; Sehili, M.; Chahuara, P. Experimental Evaluation of Speech Recognition Technologies for Voice-based Home Automation Control in a Smart Home. In Proceedings of the 4th Workshop on Speech and Language Processing for Assistive Technologies, Grenoble, France, 21–22 August 2013; pp. 99–105. [Google Scholar]
- Chahuara, P.; Portet, F.; Vacher, M. Making Context Aware Decision from Uncertain Information in a Smart Home: A Markov Logic Network Approach. In Proceedings of the 4th International Joint Conference on Ambient Intelligence, Dublin, Ireland, 3–5 December 2013; pp. 78–93. [Google Scholar]
- Antoniou, P.E.; Konstantinidis, E.I.; Billis, A.S.; Bamidis, P.D. Integrating the USEFIL Assisted Living Platform; Observation from the Field. In Proceedings of the 6th European Conference of the International Federation for Medical and Biological Engineering, Dubrovnik, Croatia, 7–11 September 2014; pp. 657–660. [Google Scholar]
- Billis, A.S.; Papageorgiou, E.I.; Frantzidis, C.; Konstantinidis, E.I.; Bamidis, P.D. Towards a hierarchically-structured decision support tool for improving seniors’ independent living: The USEFIL decision support system. In Proceedings of the 6th International Conference on Pervasive Technologies Related to Assistive Environments, Rhodes, Greece, 29–31 May 2013; pp. 1–4. [Google Scholar]
- Zhang, Q.; Su, Y.; Yu, P. Assisting an Elderly with Early Dementia Using Wireless Sensors Data in Smarter Safer Home. In Proceedings of the 15th IFIP WG 8.1 International Conference on Informatics and Semiotics in Organisations, Shanghai, China, 23–24 May 2014; pp. 398–404. [Google Scholar]
- Home Datasets List. Available online: http://boxlab.wikispaces.com/List+of+Home+Datasets (accessed on 5 October 2017).
- Costa, Â.; Andrade, F.; Novais, P. Privacy and Data Protection towards Elderly Healthcare. In Handbook of Research on ICTs for Human-Centered Healthcare and Social Care Services; IGI Global: Hershey, PA, USA, 2013; pp. 330–346. [Google Scholar]
- Rouvroy, A. Privacy, Data Protection, and the Unprecedented Challenges of Ambient Intelligence. In Studies in Ethics, Law, and Technology; Walter de Gruyter GmbH: Berlin, Germany, 2008. [Google Scholar]
- De Hert, P.; Gutwirth, S.; Moscibroda, A.; Wright, D.; Fuster, G.G. Legal safeguards for privacy and data protection in ambient intelligence. Pers. Ubiquitous Comput. 2008, 13, 435–444. [Google Scholar] [CrossRef]
- Rouvroy, A.; Poullet, Y. The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy. In Reinventing Data Protection? Springer: Berlin, Germany, 2009; pp. 45–76. [Google Scholar]
- Winn, J.K. Technical Standards as Data Protection Regulation. In Reinventing Data Protection? Springer: Berlin, Germany, 2009; pp. 191–206. [Google Scholar]
- Alwan, M.; Leachtenauer, J.; Dalal, S.; Kell, S.; Turner, B.; Mack, D.; Felder, R. Validation of rule-based inference of selected independent activities of daily living. Telemed. e-Health 2005, 11, 594–599. [Google Scholar] [CrossRef] [PubMed]
- Austin, D.; Hayes, T.L.; Kaye, J.; Mattek, N.; Pavel, M. On the disambiguation of passively measured in-home gait velocities from multi-person smart homes. J. Ambient. Intell. Smart Environ. 2011, 3, 165–174. [Google Scholar] [PubMed]
- Austin, D.; Hayes, T.L.; Kaye, J.; Mattek, N.; Pavel, M. Unobtrusive monitoring of the longitudinal evolution of in-home gait velocity data with applications to elder care. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 6495–6498. [Google Scholar]
- Barger, T.S.; Brown, D.E.; Alwan, M. Health-Status Monitoring Through Analysis of Behavioral Patterns. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2005, 35, 22–27. [Google Scholar] [CrossRef]
- Celler, B.G.; Earnshaw, W.; Ilsar, E.D.; Betbeder-Matibet, L.; Harris, M.F.; Clark, R.; Hesketh, T.; Lovell, N.H. Remote monitoring of health status of the elderly at home. A multidisciplinary project on aging at the University of New South Wales. Int. J. Bio-Med Comput. 1995, 40, 147–155. [Google Scholar] [CrossRef]
- Cook, D.J.; Schmitter-Edgecombe, M. Assessing the Quality of Activities in a Smart Environment. Methods Inf. Med. 2009, 48, 480–485. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Dalai, S.; Alwan, M.; Seifrafi, R.; Kell, S.; Brown, D. A Rule-Based Approach to the Analysis of Elders’ Activity Data: Detection of Health and Possible Emergency Conditions. In Proceedings of the AAAI Fall 2005 Symposium, Arlington, VA, USA, 4–6 November 2005; pp. 2545–2552. [Google Scholar]
- Demongeot, J.; Virone, G.; Duchêne, F.; Benchetrit, G.; Hervé, T.; Noury, N.; Rialle, V. Multi-sensors acquisition, data fusion, knowledge mining and alarm triggering in health smart homes for elderly people. Comptes Rendus Biol. 2002, 325, 673–682. [Google Scholar] [CrossRef]
- Fernández-Luque, F.J.; Zapata, J.; Ruiz, R. A system for ubiquitous fall monitoring at home via a wireless sensor network. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 2246–2249. [Google Scholar]
- Franco, C.; Demongeot, J.; Villemazet, C.; Vuillerme, N. Behavioral Telemonitoring of the Elderly at Home: Detection of Nycthemeral Rhythms Drifts from Location Data. In Proceedings of the IEEE 24th International Conference on Advanced Information Networking and Applications Workshops, Perth, WA, Australia, 20–23 April 2010; pp. 759–766. [Google Scholar]
- Glascock, A.; Kutzik, D. The impact of behavioral monitoring technology on the provision of health care in the home. J. Univ. Comput. Sci. 2006, 12, 59–79. [Google Scholar]
- Glascock, A.P.; Kutzik, D.M. Behavioral Telemedicine: A New Approach to the Continuous Nonintrusive Monitoring of Activities of Daily Living. Telemed. J. 2000, 6, 33–44. [Google Scholar] [CrossRef]
- Hagler, S.; Austin, D.; Hayes, T.L.; Kaye, J.; Pavel, M. Unobtrusive and Ubiquitous In-Home Monitoring: A Methodology for Continuous Assessment of Gait Velocity in Elders. IEEE Trans. Biomed. Eng. 2010, 57, 813–820. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hayes, T.L.; Pavel, M.; Kaye, J.A. An unobtrusive in-home monitoring system for detection of key motor changes preceding cognitive decline. In Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Francisco, CA, USA, 1–5 September 2004; pp. 2480–2483. [Google Scholar]
- Johnson, J. Consumer Response to Home Monitoring: A Survey of Older Consumers and Informal Care Providers; University of Florida: Gainesville, FL, USA, 2009. [Google Scholar]
- Kaushik, A.R.; Lovell, N.H.; Celler, B.G. Evaluation of PIR Detector Characteristics for Monitoring Occupancy Patterns of Elderly People Living Alone at Home. In Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; pp. 3802–3805. [Google Scholar]
- Kaye, J. Intelligent systems for assessment of aging changes (ISAAC): Deploying unobtrusive home-based technology. Gerontechnology 2010, 9, 121. [Google Scholar] [CrossRef]
- Lee, S.-W.; Kim, Y.-J.; Lee, G.-S.; Cho, B.-Q.; Lee, N.-H. A remote behavioral monitoring system for elders living alone. In Proceedings of the International Conference on Control, Automation and Systems, Seoul, South Korea, 17–20 October 2007; pp. 2725–2730. [Google Scholar]
- Noury, N.; Hadidi, T. Computer simulation of the activity of the elderly person living independently in a Health Smart Home. Comput. Methods Programs Biomed. 2012, 108, 1216–1228. [Google Scholar] [CrossRef] [PubMed]
- Shin, J.H.; Lee, B.; Park, K.S. Detection of Abnormal Living Patterns for Elderly Living Alone Using Support Vector Data Description. IEEE Trans. Inf. Technol. Biomed. 2011, 15, 438–448. [Google Scholar] [CrossRef] [PubMed]
- Tomita, M.R.; Mann, W.C.; Stanton, K.; Tomita, A.D.; Sundar, V. Use of Currently Available Smart Home Technology by Frail Elders. Top. Geriatr. Rehabilit. 2007, 23, 24–34. [Google Scholar] [CrossRef]
- Virone, G. Assessing everyday life behavioral rhythms for the older generation. Pervasive Mob. Comput. 2009, 5, 606–622. [Google Scholar] [CrossRef]
- Wang, S.; Skubic, M.; Zhu, Y. Activity Density Map Visualization and Dissimilarity Comparison for Eldercare Monitoring. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 607–614. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Willems, C.G.; Spreeuwenberg, M.D.; Heide, L.V.D.; Glascock, A.P.; Kutzik, D.L.; Witte, L.D.; Rietman, J. Activity Monitoring to Support Independent Living in Dutch Homecare Support; AAATE: Maastricht, The Netherlands, 2011. [Google Scholar]
- Abidine, M.; Fergani, B. News Schemes for Activity Recognition Systems Using PCA-WSVM, ICA-WSVM, and LDA-WSVM. Information 2015, 6, 505–521. [Google Scholar] [CrossRef]
- Aertssen, J.; Rudinac, M.; Jonker, P. Fall and Action Detection in Elderly Homes; AAATE: Maastricht, The Netherlands, 2011. [Google Scholar]
- Auvinet, E.; Reveret, L.; St-Arnaud, A.; Rousseau, J.; Meunier, J. Fall detection using multiple cameras. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 2554–2557. [Google Scholar]
- Auvinet, E.; Multon, F.; Saint-Arnaud, A.; Rousseau, J.; Meunier, J. Fall Detection with Multiple Cameras: An Occlusion-Resistant Method Based on 3-D Silhouette Vertical Distribution. IEEE Trans. Inf. Technol. Biomed. 2011, 15, 290–300. [Google Scholar] [CrossRef] [PubMed]
- Belshaw, M.; Taati, B.; Giesbrecht, D.; Mihailidis, A. Intelligent vision-based fall detection system: Preliminary results from a real-world deployment. Rehabil. Eng. Assist. Technol. Soc. N. Am. 2011, 1–4. [Google Scholar] [CrossRef]
- Belshaw, M.; Taati, B.; Snoek, J.; Mihailidis, A. Towards a single sensor passive solution for automated fall detection. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 1773–1776. [Google Scholar]
- Berlin, S.J.; John, M. Human interaction recognition through deep learning network. In Proceedings of the IEEE International Carnahan Conference on Security Technology (ICCST), Orlando, FL, USA, 24–27 October 2016; pp. 1–4. [Google Scholar]
- Brulin, D.; Benezeth, Y.; Courtial, E. Posture Recognition Based on Fuzzy Logic for Home Monitoring of the Elderly. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 974–982. [Google Scholar] [CrossRef] [PubMed]
- Chen, H.; Wang, G.; Xue, J.H.; He, L. A novel hierarchical framework for human action recognition. Pattern Recognit. 2016, 55, 148–159. [Google Scholar] [CrossRef] [Green Version]
- Lin, C.W.; Ling, Z.H. Automatic Fall Incident Detection in Compressed Video for Intelligent Homecare. In Proceedings of the 16th International Conference on Computer Communications and Networks, Honolulu, HI, USA, 13–16 August 2007; pp. 1172–1177. [Google Scholar]
- Du, Y.; Wang, W.; Wang, L. Hierarchical recurrent neural network for skeleton based action recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 8–10 June 2015; pp. 1110–1118. [Google Scholar]
- Foroughi, H.; Aski, B.S.; Pourreza, H. Intelligent video surveillance for monitoring fall detection of elderly in home environments. In Proceedings of the 11th International Conference on Computer and Information Technology, Khulna, Bangladesh, 24–27 December 2008; pp. 219–224. [Google Scholar]
- Huang, Z.; Wan, C.; Probst, T.; Gool, L.V. Deep Learning on Lie Groups for Skeleton-Based Action Recognition; arXiv preprint; Cornell University Library: Ithaca, NY, USA, 2016. [Google Scholar]
- Kreković, M.; Čerić, P.; Dominko, T.; Ilijaš, M.; Ivančić, K.; Skolan, V.; Šarlija, J. A method for real-time detection of human fall from video. In Proceedings of the 35th International Convention MIPRO, Opatija, Croatia, 21–25 May 2012; pp. 1709–1712. [Google Scholar]
- Lan, Z.; Lin, M.; Li, X.; Hauptmann, A.G.; Raj, B. Beyond gaussian pyramid: Multi-skip feature stacking for action recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 8–10 June 2015; pp. 204–212. [Google Scholar]
- Li, Y.; Li, W.; Mahadevan, V.; Vasconcelos, N. Vlad3: Encoding dynamics of deep features for action recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 26 June–1 July 2016; pp. 1951–1960. [Google Scholar]
- Li, Y.; Ho, K.C.; Popescu, M. A Microphone Array System for Automatic Fall Detection. IEEE Trans. Biomed. Eng. 2012, 59, 1291–1301. [Google Scholar] [PubMed] [Green Version]
- Lee, Y.; Kim, J.; Son, M.; Lee, M. Implementation of Accelerometer Sensor Module and Fall Detection Monitoring System based on Wireless Sensor Network. In Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; pp. 2315–2318. [Google Scholar]
- Lee, T.; Mihailidis, A. An intelligent emergency response system: Preliminary development and testing of automated fall detection. J. Telemed. Telecare 2005, 11, 194–198. [Google Scholar] [CrossRef] [PubMed]
- Lee, Y.-S.; Chung, W.-Y. Visual Sensor Based Abnormal Event Detection with Moving Shadow Removal in Home Healthcare Applications. Sensors 2012, 12, 573–584. [Google Scholar] [CrossRef] [PubMed]
- Leone, A.; Diraco, G.; Siciliano, P. Detecting falls with 3D range camera in ambient assisted living applications: A preliminary study. Med. Eng. Phys. 2011, 33, 770–781. [Google Scholar] [CrossRef] [PubMed]
- Mirmahboub, B.; Samavi, S.; Karimi, N.; Shirani, S. Automatic Monocular System for Human Fall Detection Based on Variations in Silhouette Area. IEEE Trans. Biomed. Eng. 2013, 60, 427–436. [Google Scholar] [CrossRef] [PubMed]
- Mo, L.; Li, F.; Zhu, Y.; Huang, A. Human physical activity recognition based on computer vision with deep learning model. In Proceedings of the IEEE International Instrumentation and Measurement Technology Conference Proceedings, Taipei, Taiwan, 23–26 May 2016; pp. 1–6. [Google Scholar]
- Peng, X.; Wang, L.; Wang, X.; Qiao, Y. Bag of Visual Words and Fusion Methods for Action Recognition: Comprehensive Study and Good Practice. Comput. Vis. Image Underst. 2016, 150, 109–125. [Google Scholar]
- Peng, X.; Zou, C.; Qiao, Y.; Peng, Q. Action recognition with stacked fisher vectors. In Computer Vision—ECCV 2014; Springer: Berlin, Germany, 2014; pp. 581–595. [Google Scholar]
- Rougier, C.; Meunier, J.; St-Arnaud, A.; Rousseau, J. Fall Detection from Human Shape and Motion History Using Video Surveillance. In Proceedings of the 21st International Conference on Advanced Information Networking and Applications Workshops (AINAW’07), Niagara Falls, ON, Canada, 21–23 May 2007; pp. 875–880. [Google Scholar]
- Shahroudy, A.; Ng, T.T.; Yang, Q.; Wang, G. Multimodal multipart learning for action recognition in depth videos. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 38, 2123–2129. [Google Scholar] [CrossRef] [PubMed]
- Shi, Y.; Tian, Y.; Wang, Y.; Huang, T. Sequential deep trajectory descriptor for action recognition with three-stream CNN. IEEE Trans. Multimed. 2017, 19, 1510–1520. [Google Scholar] [CrossRef]
- Shieh, W.-Y.; Huang, J.-C. Falling-incident detection and throughput enhancement in a multi-camera video-surveillance system. Med. Eng. Phys. 2012, 34, 954–963. [Google Scholar] [CrossRef] [PubMed]
- Simonyan, K.; Zisserman, A. Two-stream convolutional networks for action recognition in videos. In Advances in Neural Information Processing Systems; MIT Press: Cambridge, MA, USA, 2014; pp. 568–576. [Google Scholar]
- Uddin, M.Z. Human activity recognition using segmented body part and body joint features with hidden Markov models. Multimed. Tools Appl. 2016, 76, 13585–13614. [Google Scholar] [CrossRef]
- Uddin, M.Z.; Khaksar, W.; Torresen, J. A robust gait recognition system using spatiotemporal features and deep learning. In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Daegu, Korea, 16–18 November 2017; pp. 156–161. [Google Scholar]
- Uddin, M.Z.; Khaksar, W.; Torresen, J. Human activity recognition using robust spatiotemporal features and convolutional neural network. In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Daegu, Korea, 16–18 November 2017; pp. 144–149. [Google Scholar]
- Veeriah, V.; Zhuang, N.; Qi, G.J. Differential recurrent neural networks for action recognition. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 11–18 December 2015; pp. 4041–4049. [Google Scholar]
- Wang, J.; Liu, Z.; Wu, Y.; Yuan, J. Learning actionlet ensemble for 3D human action recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 36, 914–927. [Google Scholar] [CrossRef] [PubMed]
- Wang, P.; Li, W.; Gao, Z.; Zhang, J.; Tang, C.; Ogunbona, P.O. Action recognition from depth maps using deep convolutional neural networks. IEEE Trans. Hum. Mach. Syst. 2016, 46, 498–509. [Google Scholar] [CrossRef]
- Wang, P.; Li, W.; Gao, Z.; Tang, C.; Zhang, J.; Ogunbona, P. ConvNets-based action recognition from depth maps through virtual cameras and Pseudocoloring. In Proceedings of the 23rd ACM International Conference on Multimedia, Brisbane, Australia, 26–30 October 2015; pp. 1119–1122. [Google Scholar]
- Wang, L.; Qiao, Y.; Tang, X. Action recognition with trajectory-pooled deep-convolutional descriptors. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 8–10 June 2015; pp. 4305–4314. [Google Scholar]
- Willems, J.; Debard, G.; Vanrumste, B.; Goedemé, T. A Video-based Algorithm for Elderly Fall Detection. In Proceedings of the World Congress on Medical Physics and Biomedical Engineering, Munich, Germany, 7–12 September 2009; pp. 312–315. [Google Scholar]
- Yang, X.; Tian, Y. Super normal vector for human activity recognition with depth cameras. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1028–1039. [Google Scholar] [CrossRef] [PubMed]
- Yu, M.; Rhuma, A.; Naqvi, S.M.; Wang, L.; Chambers, J. A Posture Recognition-Based Fall Detection System for Monitoring an Elderly Person in a Smart Home Environment. IEEE Trans. Inf. Technol. Biomed. 2012, 16, 1274–1286. [Google Scholar] [PubMed] [Green Version]
- Zhen, X.; Shao, L. Action recognition via spatio-temporal local features: A comprehensive study. Image Vis. Comput. 2016, 50, 1–13. [Google Scholar] [CrossRef] [Green Version]
- Zhu, W.; Lan, C.; Xing, J.; Zeng, W.; Li, Y.; Shen, L.; Xie, X. Co-Occurrence Feature Learning for Skeleton Based Action Recognition Using Regularized Deep LSTM Networks. AAAI 2016, 2, 8. [Google Scholar]
- Arcelus, A.; Herry, C.L.; Goubran, R.A.; Knoefel, F.; Sveistrup, H.; Bilodeau, M. Determination of Sit-to-Stand Transfer Duration Using Bed and Floor Pressure Sequences. IEEE Trans. Biomed. Eng. 2009, 56, 2485–2492. [Google Scholar] [CrossRef] [PubMed]
- Arcelus, A.; Holtzman, M.; Goubran, R.; Sveistrup, H.; Guitard, P.; Knoefel, F. Analysis of commode grab bar usage for the monitoring of older adults in the smart home environment. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009; pp. 6155–6158. [Google Scholar]
- Arcelus, A.; Goubran, R.; Sveistrup, H.; Bilodeau, M.; Knoefel, F. Context-aware smart home monitoring through pressure measurement sequences. In Proceedings of the IEEE International Workshop on Medical Measurements and Applications, Ottawa, ON, Canada, 30 April–1 May 2010; pp. 32–37. [Google Scholar]
- Fleury, A.; Noury, N.; Vacher, M.; Glasson, H.; Seri, J.-F. Sound and speech detection and classification in a Health Smart Home. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 4644–4647. [Google Scholar]
- Khan, M.S.; Yu, M.; Feng, P.; Wang, L.; Chambers, J. An unsupervised acoustic fall detection system using source separation for sound interference suppression. Signal Process. 2015, 110, 199–210. [Google Scholar] [CrossRef] [Green Version]
- Li, Y.; Zeng, Z.; Popescu, M.; Ho, K.C. Acoustic fall detection using a circular microphone array. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 2242–2245. [Google Scholar]
- Li, Y.; Popescu, M.; Ho, K.C.; Nabelek, D.P. Improving acoustic fall recognition by adaptive signal windowing. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 7589–7592. [Google Scholar]
- Popescu, M.; Li, Y.; Skubic, M.; Rantz, M. An acoustic fall detector system that uses sound height information to reduce the false alarm rate. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 4628–4631. [Google Scholar]
- Popescu, M.; Mahnot, A. Acoustic fall detection using one-class classifiers. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009. [Google Scholar]
- Vacher, M.; Istrate, D.; Portet, F.; Joubert, T.; Chevalier, T.; Smidtas, S.; Meillon, B.; Lecouteux, B.; Sehili, M.; Chahuara, P.; et al. The sweet-home project: Audio technology in smart homes to improve well-being and reliance. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 5291–5294. [Google Scholar]
- Zhuang, X.; Huang, J.; Potamianos, G.; Hasegawa-Johnson, M. Acoustic fall detection using Gaussian mixture models and GMM supervectors. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Taipei, Taiwan, 19–24 April 2009; pp. 69–72. [Google Scholar]
- Alwan, M.; Rajendran, P.J.; Kell, S.; Mack, D.; Dalal, S.; Wolfe, M.; Felder, R. A Smart and Passive Floor-Vibration Based Fall Detector for Elderly. In Proceedings of the 2nd International Conference on Information & Communication Technologies, Damascus, Syria, 24–28 April 2006; pp. 1003–1007. [Google Scholar]
- Lombardi, M.; Vezzani, R.; Cucchiara, R. Detection of Human Movements with Pressure Floor Sensors. In Image Analysis and Processing—ICIAP; Murino, V., Puppo, E., Eds.; Springer: Berlin, Germany, 2015; pp. 620–630. [Google Scholar]
- Serra, R.; di Croce, P.; Peres, R.; Knittel, D. Human step detection from a piezoelectric polymer floor sensor using normalization algorithms. In Proceedings of the IEEE SENSORS, Valencia, Spain, 2–5 November 2014; pp. 1169–1172. [Google Scholar]
- Forouzanfar, M.; Mabrouk, M.; Rajan, S.; Bolic, M.; Dajani, H.R.; Groza, V.Z. Event Recognition for Contactless Activity Monitoring Using Phase-Modulated Continuous Wave Radar. IEEE Trans. Biomed. Eng. 2017, 64, 479–491. [Google Scholar] [CrossRef] [PubMed]
- Kim, Y.; Toomajian, B. Hand Gesture Recognition Using Micro-Doppler Signatures with Convolutional Neural Network. IEEE Access 2016, 4, 7125–7130. [Google Scholar] [CrossRef]
- Lien, J.; Gillian, N.; Karagozler, M.E.; Amihood, P.; Schwesig, C.; Olson, E.; Raja, H.; Poupyrev, I. Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar. ACM Trans. Graph. 2016, 35, 1–19. [Google Scholar] [CrossRef] [Green Version]
- Rui, L.; Chen, S.; Ho, K.C.; Rantz, M.; Skubic, M. Estimation of human walking speed by Doppler radar for elderly care. J. Ambient. Intell. Smart Environ. 2017, 9, 181–191. [Google Scholar] [CrossRef]
- Wan, Q.; Li, Y.; Li, C.; Pal, R. Gesture recognition for smart home applications using portable radar sensors. In Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014; pp. 6414–6417. [Google Scholar]
- Alwan, M.; Leachtenauer, J.; Dalal, S.; Mack, D.; Kell, S.; Turner, B.; Felder, R. Psychosocial Impact of Monitoring Technology in Assisted Living: A Pilot Study. In Proceedings of the 2nd International Conference on Information & Communication Technologies, Damascus, Syria, 24–28 April 2006; pp. 998–1002. [Google Scholar]
- Alwan, M.; Kell, S.; Turner, B.; Dalal, S.; Mack, D.; Felder, R. Psychosocial Impact of Passive Health Status Monitoring on Informal Caregivers and Older Adults Living in Independent Senior Housing. In Proceedings of the 2nd International Conference on Information & Communication Technologies, Damascus, Syria, 24–28 April 2006; pp. 808–813. [Google Scholar]
- Alwan, M.; Sifferlin, E.B.; Turner, B.; Kell, S.; Brower, P.; Mack, D.C.; Dalal, S.; Felder, R.A. Impact of Passive Health Status Monitoring to Care Providers and Payers in Assisted Living. Telemed. e-Health 2007, 13, 279–285. [Google Scholar] [CrossRef] [PubMed]
- Ariani, A.; Redmond, S.J.; Chang, D.; Lovell, N.H. Simulated Unobtrusive Falls Detection with Multiple Persons. IEEE Trans. Biomed. Eng. 2012, 59, 3185–3196. [Google Scholar] [CrossRef] [PubMed]
- Bamis, A.; Lymberopoulos, D.; Teixeira, T.; Savvides, A. Towards precision monitoring of elders for providing assistive services. In Proceedings of the 1st ACM international conference on PErvasive Technologies Related to Assistive Environments—PETRA’08, Athens, Greece, 16–18 July 2008; pp. 1–8. [Google Scholar]
- Bamis, A.; Lymberopoulos, D.; Teixeira, T.; Savvides, A. The BehaviorScope framework for enabling ambient assisted living. Pers. Ubiquitous Comput. 2010, 14, 473–487. [Google Scholar] [CrossRef] [Green Version]
- Celler, B.G.; Ilsar, E.D.; Earnshaw, W. Preliminary results of a pilot project on remote monitoring of functional health status in the home. In Proceedings of the 18th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Amsterdam, The Netherlands, 31 October–3 November 1996; pp. 63–64. [Google Scholar]
- Chung, K.; Song, K.; Shin, K.; Sohn, J.; Cho, S.; Chang, J.-H. Noncontact Sleep Study by Multi-Modal Sensor Fusion. Sensors 2017, 17, 1685. [Google Scholar] [CrossRef] [PubMed]
- Guettari, T.; Aguilar, P.A.C.; Boudy, J.; Medjahed, H.; Istrate, D.; Baldinger, J.-L.; Belfeki, I.; Opitz, M.; Maly-Persy, M. Multimodal localization in the context of a medical telemonitoring system. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 3835–3838. [Google Scholar]
- Kinney, J.M. Striving to Provide Safety Assistance for Families of Elders: The SAFE House Project. Dementia 2004, 3, 351–370. [Google Scholar] [CrossRef]
- Lotfi, A.; Langensiepen, C.; Mahmoud, S.M.; Akhlaghinia, M.J. Smart homes for the elderly dementia sufferers: Identification and prediction of abnormal behaviour. J. Ambient. Intell. Humaniz. Comput. 2011, 3, 205–218. [Google Scholar] [CrossRef]
- Rantz, M.; Skubic, M.; Miller, S.; Krampe, J. Using Technology to Enhance Aging in Place. In Proceedings of the International Conference on Smart Homes and Health Telematics, Singapore, 10–12 July 2018; pp. 169–176. [Google Scholar]
- Van Hoof, J.; Kort, H.S.M.; Rutten, P.G.S.; Duijnstee, M.S.H. Ageing-in-place with the use of ambient intelligence technology: Perspectives of older users. Int. J. Med. Inf. 2011, 80, 310–331. [Google Scholar] [CrossRef] [PubMed]
- Zhou, F.; Jiao, J.R.; Chen, S.; Zhang, D. A Case-Driven Ambient Intelligence System for Elderly in-Home Assistance Applications. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2011, 41, 179–189. [Google Scholar] [CrossRef]
- Zouba, N.; Bremond, F.; Thonnat, M.; Anfosso, A.; Pascual, É.; Malléa, P.; Mailland, V.; Guerin, O. A computer system to monitor older adults at home: Preliminary results. Gerontechnology 2009, 8, 129–139. [Google Scholar] [CrossRef]
- Zouba, N.; Bremond, F.; Thonnat, M. Multisensor Fusion for Monitoring Elderly Activities at Home. In Proceedings of the Sixth IEEE International Conference on Advanced Video and Signal Based Surveillance, Genova, Italy, 2–4 September 2009; pp. 98–103. [Google Scholar]
- Aghajan, H.; Augusto, J.C.; Wu, C.; McCullagh, P.; Walkden, J.-A. Distributed vision-based accident management for assisted living. In Proceedings of the 5th International Conference on Smart Homes and Health Telematics, Nara, Japan, 21–23 June 2007. [Google Scholar]
- Bang, S.; Kim, M.; Song, S.; Park, S.-J. Toward real time detection of the basic living activity in home using a wearable sensor and smart home sensors. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 5200–5203. [Google Scholar]
- Bianchi, F.; Redmond, S.J.; Narayanan, M.R.; Cerutti, S.; Celler, B.G.; Lovell, N.H. Falls event detection using triaxial accelerometry and barometric pressure measurement. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009; pp. 6111–6114. [Google Scholar]
- Cao, Y.; Tao, L.; Xu, G. An Event-driven Context Model in Elderly Health Monitoring. In Proceedings of the Symposia and Workshops on Ubiquitous, Autonomic and Trusted Computing, Brisbane, Australia, 7–9 July 2009; pp. 120–124. [Google Scholar]
- Hein, A.; Winkelbach, S.; Martens, B.; Wilken, O.; Eichelberg, M.; Spehr, J.; Gietzelt, M.; Wolf, K.H.; Busching, F.; Hulsken-Giesler, M.; et al. Monitoring systems for the support of home care. Inf. Health Soc Care 2010, 35, 157–176. [Google Scholar] [CrossRef] [PubMed]
- Medjahed, H.; Istrate, D.; Boudy, J.; Dorizzi, B. Human activities of daily living recognition using fuzzy logic for elderly home monitoring. In Proceedings of the IEEE International Conference on Fuzzy Systems, Jeju Island, Korea, 20–24 August 2009; pp. 2001–2006. [Google Scholar]
- Nyan, M.N.; Tay, F.E.H.; Tan, A.W.Y.; Seah, K.H.W. Distinguishing fall activities from normal activities by angular rate characteristics and high-speed camera characterization. Med. Eng. Phys. 2006, 28, 842–849. [Google Scholar] [CrossRef] [PubMed]
- Roy, P.C.; Bouzouane, A.; Giroux, S.; Bouchard, B. Possibilistic Activity Recognition in Smart Homes for Cognitively Impaired People. Appl. Artif. Intell. 2011, 25, 883–926. [Google Scholar] [CrossRef]
- Sim, K.; Phua, C.; Yap, G.; Biswas, J.; Mokhtari, M. Activity recognition using correlated pattern mining for people with dementia. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 7593–7597. [Google Scholar]
- Srinivasan, S.; Han, J.; Lal, D.; Gacic, A. Towards automatic detection of falls using wireless sensors. In Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; pp. 1379–1382. [Google Scholar]
- Tolkiehn, M.; Atallah, L.; Lo, B.; Yang, G.-Z. Direction sensitive fall detection using a triaxial accelerometer and a barometric pressure sensor. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 369–372. [Google Scholar]
- Fukai, H.; Nishie, Y.; Abiko, K.; Mitsukura, Y.; Fukumi, M.; Tanaka, M. An age estimation system on the AIBO. In Proceedings of the International Conference on Control, Automation and Systems, Seoul, Korea, 14–17 October 2008; pp. 2551–2554. [Google Scholar]
- Noury, N. AILISA: Experimental platforms to evaluate remote care and assistive technologies in gerontology. In Proceedings of the 7th International Workshop on Enterprise networking and Computing in Healthcare Industry, Busan, Korea, 23–25 June 2005; pp. 67–72. [Google Scholar]
- Chul, H.; Ryul, J.H.; Stonier, K.D. The study of robot platform for orchestrating and reusing services. In Proceedings of the IEEE Workshop on Advanced Robotics and its Social Impacts, Seoul, Korea, 26–28 October 2010; pp. 162–164. [Google Scholar]
- Graf, B.; Reiser, U.; Hagele, M.; Mauz, K.; Klein, P. Robotic home assistant Care-O-bot 3 and innovation platform. In Proceedings of the IEEE Workshop on Advanced Robotics and its Social Impacts, Tokyo, Japan, 23–25 November 2009; pp. 139–144. [Google Scholar]
- Coradeschi, S.; Cesta, A.; Cortellessa, G.; Coraci, L.; Galindo, C.; Gonzalez, J.; Karlsson, L.; Forsberg, A.; Frennert, S.; Furfari, F.; et al. GiraffPlus: A System for Monitoring Activities and Physiological Parameters and Promoting Social Interaction for Elderly. Hum.-Comput. Syst. Interact. Backgr. Appl. 2014, 3, 261–271. [Google Scholar]
- Lera, F.J.; Rodríguez, V.; Rodríguez, C.; Matellán, V. Augmented Reality in Robotic Assistance for the Elderly. Intell. Syst. Control. Autom. Sci. Eng. 2013, 70, 3–11. [Google Scholar]
- Leite, I.; Pereira, A.; Martinho, C.; Paiva, A. Are emotional robots more fun to play with? In Proceedings of the RO-MAN—The 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 1–3 August 2008; pp. 77–82. [Google Scholar]
- Stiehl, W.D.; Lieberman, J.; Breazeal, C.; Basel, L.; Lalla, L.; Wolf, M. Design of a therapeutic robotic companion for relational, affective touch. In Proceedings of the ROMAN IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA, 13–15 August 2005; pp. 408–415. [Google Scholar]
- Aminuddin, R.; Sharkey, A.; Levita, L. Interaction with the Paro robot may reduce psychophysiological stress responses. In Proceedings of the 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 593–594. [Google Scholar]
- Pollack, M.; Engberg, S.; Matthews, J.; Thrun, S.; Brown, L.; Colbry, D.; Orosz, C.; Peintner, B.; Ramakrishnan, S.; Dunbar-Jacob, J.; et al. Pearl: A mobile robotic assistant for the elderly. In Proceedings of the Workshop on Automation as Caregiver: The Role of Intelligent Technology in Elder Care (AAAI), Menlo Park, CA, USA, 28–29 July 2002; pp. 85–92. [Google Scholar]
- Onishi, M.; Luo, Z.; Odashima, T.; Hirano, S.; Tahara, K.; Mukai, T. Generation of Human Care Behaviors by Human-Interactive Robot RI-MAN. In Proceedings of the IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 3128–3129. [Google Scholar]
- Gupta, P.; Verma, P.; Gupta, R.; Verma, B. MovAid—A novel device for advanced rehabilitation monitoring. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 4655–4658. [Google Scholar]
- Rodriguez-Losada, D.; Matia, F.; Jimenez, A.; Galan, R.; Lacey, G. Implementing Map Based Navigation in Guido, the Robotic Smart Walker. In Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005. [Google Scholar]
- Kriglstein, S.; Wallner, G. HOMIE: An artificial companion for elderly people. In Proceedings of the CHI '05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005. [Google Scholar]
- Onishi, K. ‘wakamaru’, the Robot for Your Home. J. Soc. Mech. Eng. 2006, 109, 448–449. [Google Scholar] [CrossRef]
- Han, J.; Lee, S.; Kang, B.; Park, S.; Kim, J.; Kim, M.; Kim, M. A trial English class with a teaching assistant robot in elementary school. In Proceedings of the 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Osaka, Japan, 2–5 March 2010. [Google Scholar]
- Kato, S.; Ohshiro, S.; Itoh, H.; Kimura, K. Development of a communication robot Ifbot. In Proceedings of the ICRA’04, 2004 IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004; pp. 697–702. [Google Scholar]
- Cheng, C. Fujitsu Talking Robot Teddy Bear: Hands-On with Video. PC Magazine. Available online: http://www.pcmag.com/article2/0,2817,2375443,00.asp (accessed on 22 June 2018).
- Stiehl, W.D.; Lieberman, J.; Breazeal, C.; Basel, L.; Cooper, R.; Knight, H.; Lalla, L.; Maymin, A.; Purchase, S. The huggable: A therapeutic robotic companion for relational, affective touch. In Proceedings of the 2006 3rd IEEE Consumer Communications and Networking Conference (CCNC 2006), Boston, MA, USA, 30 July–3 August 2006. [Google Scholar]
- iCat Research Platform. iCat Research Community. Available online: https://ercim-news.ercim.eu/en67/special-theme-embedded-intelligence/icat-a-friendly-robot-that-helps-children-and-grown-ups (accessed on 11 June 2018).
- XeThru Radar in Action. XeThru. Available online: https://www.xethru.com/ (accessed on 11 June 2018).
- Liu, X.; Cao, J.; Tang, S.; Wen, J.; Guo, P. Contactless Respiration Monitoring Via Off-the-Shelf WiFi Devices. IEEE Trans. Mob. Comput. 2016, 15, 2466–2479. [Google Scholar] [CrossRef]
- Vimarlund, V.; Wass, S. Big Data, Smart Homes and Ambient Assisted Living. IMIA Yearb. 2014, 9, 143–149. [Google Scholar] [CrossRef] [PubMed]
- Linskell, J. Smart Home Technology and Special Needs: Reporting UK Activity and Sharing Implementation Experiences from Scotland. In Proceedings of the 5th International ICST Conference on Pervasive Computing Technologies for Healthcare, Dublin, Ireland, 23–26 May 2011; pp. 287–291. [Google Scholar]
Sensor | Type | Characteristics | Cost ($) |
---|---|---|---|
Magnetic switch | Ambient | The binary-status-providing sensors are easily installable. They are mainly used to detect the opening of doors, windows, etc. | 5 ± 0.75 |
Temperature sensor | Ambient | The continuous-data-providing sensors detect the temperature of the ambient environment. | 9 ± 2 |
Photosensor | Ambient | The sensors detect illuminance and provide continuous data. | 5 ± 1.25 |
Pressure pad sensor | Ambient | The sensors provide continuous pressure measurement at any surface. | 25 ± 5 |
Water flow sensor | Ambient | The sensors continuously measure the flow of water in taps or showers. | 24 ± 3 |
Infrared motion sensor | Ambient | The binary-status-providing sensors detect motion in the coverage area. | 35 ± 2 |
Force sensor | Ambient | The sensors detect movement and falls. | 33 ± 5 |
Smoke sensor | Ambient | The binary-status-providing sensors detect smoke in the environment. | 18 ± 6 |
Biosensor | Wearable | The sensors monitor vital signs. They are difficult to install and require professional adjustment. | 180 ± 5 |
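The distinction the table above draws between binary-status sensors (magnetic switches, PIR, smoke) and continuous-data sensors (temperature, photosensor, pressure pad) can be sketched as a minimal event model. The class names, sensor identifiers, and the activity-threshold convention below are illustrative assumptions, not taken from any surveyed system.

```python
from dataclasses import dataclass
from enum import Enum

class SensorKind(Enum):
    BINARY = "binary"          # e.g., magnetic switch, PIR motion, smoke
    CONTINUOUS = "continuous"  # e.g., temperature, photosensor, pressure pad

@dataclass
class SensorEvent:
    sensor_id: str
    kind: SensorKind
    value: float      # 0/1 for binary sensors, raw reading otherwise
    timestamp: float  # seconds since epoch

def is_active(event: SensorEvent, threshold: float = 0.5) -> bool:
    """Normalize both sensor kinds to a single activity flag."""
    if event.kind is SensorKind.BINARY:
        return event.value >= 0.5
    return event.value >= threshold

# Example: a PIR trigger and a pressure-pad reading (hypothetical IDs)
pir = SensorEvent("pir_kitchen", SensorKind.BINARY, 1, 1_700_000_000.0)
pad = SensorEvent("pad_bed", SensorKind.CONTINUOUS, 12.3, 1_700_000_001.0)
print(is_active(pir))        # True
print(is_active(pad, 20.0))  # False: reading below the assumed threshold
```

A uniform event abstraction of this kind is what allows the heterogeneous sensors in a smart apartment to feed one activity-recognition pipeline.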
Research Authors (Year) | Target | Research Techniques | Results |
---|---|---|---|
Alwan et al. [32] (2005) | Recognition of activities of daily living | The work used the following approaches: Rule-based recognition of activities (e.g., eating and showering); Fifteen on/off switches in different places, such as the microwave oven and different doors; Binary features (on/off) were used for rule-based recognition of activities of daily living; More than five weeks of activity monitoring; Subjects were provided portable personal digital assistant (PDA) devices for recording ground truth data. | 91% sensitivity; 100% specificity. |
Austin et al. [33] (2011) | Gait analysis | The work used Gaussian mixture modeling on motion sensor data for three years of residence monitoring of different people. | 95% accuracy. |
Austin et al. [34] (2011) | Gait analysis | The authors applied Gaussian-kernel-based probability density functions for three years of monitoring of two elderly subjects. | The approach detects abrupt changes in gait function and slower variations of gait velocity over time. |
Barger et al. [35] (2005) | Recognition of activities of daily living | The work applied a probabilistic mixture model to raw motion sensor data for recognition of different activities. Subjects were monitored for 65 days, and the results were then accumulated. The project utilized a set of low-cost motion sensors. Two types of evaluations were performed: workdays and off-days. | The motion sensor data were grouped into 139 clusters. The experimental results showed that some frequent clusters occurred consistently over time with low classification uncertainty.
Celler [36] (1995) | Recognition of activities of daily living | It was a pilot project with five months of monitoring the functional health status of the elderly at home. Parameters that are sensitive to changes in health were continuously recorded. | The project explained the technical functionality for monitoring the functional health status of the elderly in the smart home. |
Cook & Schmitter-Edgecombe [37] (2009) | Recognition of activities of daily living | The work adopted Markov models for modeling daily activities. | 98% accuracy. |
Dalal et al. [38] (2005) | Recognition of activities of daily living | The work adopted rule-based recognition based on correlation algorithms. Each elderly person was monitored for 37 days. | 91% sensitivity; 100% specificity. |
Demongeot et al. [39] (2002) | Recognition of activities of daily living | The authors applied mostly threshold features for rule-based recognition. | Only analytical studies were performed, rather than reporting accuracies of proposed approaches. |
Fernandez-Llatas et al. [40] (2010) | Recognition of activities of daily living | Simple rules were applied to an ongoing project to focus on various daily activities. | The work was only an analysis of an ongoing project, which was carried out to test different approaches without reporting any specific results. |
Franco et al. [41] (2010) | Recognition of activities of daily living | The work used circular Hamming distance based on temporal shift, which was applied to monitor elderly persons for 49 days. | Different days were considered to explain the functionality. |
Glascock & Kutzik [42] (2006) | Recognition of activities of daily living | The work applied Gaussian mixtures to model human activities. The study was performed on two field sites, where elderly monitoring was carried out for half a year and a full year. | 98% reliability.
Glascock & Kutzik [43] (2000) | Recognition of activities of daily living | Multiple activities were annotated based on specific software to monitor behavior. Elderly monitoring was performed for 12 days. | The functionality of the behavior monitoring system was elaborated for different days. It can be used in eldercare centers to obtain temporal information based on behavioral variations. |
Hagler et al. [44] (2010) | Gait recognition | A simulation study was performed on gait analysis in a predefined laboratory setting. | 98.9% accuracy. |
Hayes et al. [45] (2004) | Recognition of activities of daily living | A Gaussian-kernel-based approach was described that was based on probability density functions for describing walking in-home. Eight weeks of monitoring of walking was carried out. | 98.1% accuracy. |
Kaye et al. [48] (2010) | Recognition of activities of daily living and gait | For an average of 33 months, different types of sensors were installed in the homes of 265 elderly people. Different metrics were assessed, such as total daily activity, time out of the home, and walking speed. Participants were also assessed yearly with questionnaires, physical examinations, and neuropsychological tests. | Elderly people left their homes twice a day on average for approximately 208 min per day. Average in-home walking speed was 61.0 cm/s. They spent 43% of days on the computer for an average of 76 min per day. |
Lee et al. [49] (2007) | Recognition of activities of daily living | A behavioral monitoring system was developed for elderly people who are living alone. The PIR-sensor-based in-house sensing system could detect the motion of an elder and send the data to a database. In addition, a web-based monitoring system was developed for remote monitoring of the elderly by caregivers. The system was installed in nine elderly homes for three months. | 86.6% accuracy. |
Noury & Haddidi [50] (2012) | Recognition of activities of daily living | A simulator was proposed that focuses on human activities based on presence sensors in the smart home for elderly healthcare. Previously recorded real activity data were used to build a mathematical model based on hidden Markov models (HMMs) for producing simulated data series for various scenarios. In addition, similarity measurements were obtained between real and simulated data. | 99.91% accuracy.
Shin et al. [51] (2011) | Recognition of activities of daily living | Several sensors were installed in different places in a smart home to monitor abnormal activity patterns. Observations were made for 51 and 157 days. | 90.5% accuracy. |
Tomita et al. [52] (2007) | Recognition of activities of daily living | A case study was performed for two years of elderly monitoring in smart homes. | 91% recommendation. |
Virone [53] (2009) | Recognition of activities of daily living | It was a simulated case study in which a pattern recognition model for daily activity monitoring was tested. Activity deviation was also considered during activity monitoring. | 98% accuracy. |
Wang et al. [54] (2012) | Recognition of activities of daily living | Activity pattern deviations were considered for early detection of health changes. Dissimilarities among different activity density maps were computed to automatically determine changes in activity patterns. Elderly subjects were monitored for one, four, and three months. | Dissimilarities among activity density maps were in the range of 0.30–0.52. |
Willems et al. [55] (2011) | Recognition of activities of daily living | A pilot study was performed to examine potential effects of activity monitoring on users and formal and informal caregivers. The study was performed based on the observations from two years of monitoring. Various questionnaires were used to assess quality of life and health status. | The functionality of the system was illustrated in detail. After the assessment, no significant variations were found based on the client questionnaires. |
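Several of the switch-based studies in the table above (e.g., the rule-based approaches of Alwan et al. [32] and Dalal et al. [38]) recognize activities of daily living by checking which binary sensors fire together within a time window. A minimal sketch of that idea follows; the sensor names, rule sets, and 10-minute window are assumptions for illustration, not the cited systems' actual configurations.

```python
from datetime import datetime, timedelta

# Hypothetical rules: an activity is recognized when all of its required
# switches fire inside one time window.
RULES = {
    "meal_preparation": {"fridge_door", "microwave"},
    "showering": {"bathroom_door", "shower_flow"},
}

def recognize(events, window=timedelta(minutes=10)):
    """events: time-sorted list of (timestamp, sensor_name) firings.
    Returns the set of activities whose full rule set fires in one window."""
    detected = set()
    for i, (t0, _) in enumerate(events):
        # All sensors seen within `window` of this firing
        seen = {name for t, name in events[i:] if t - t0 <= window}
        for activity, required in RULES.items():
            if required <= seen:
                detected.add(activity)
    return detected

t = datetime(2018, 6, 1, 8, 0)
log = [(t, "fridge_door"), (t + timedelta(minutes=3), "microwave")]
print(recognize(log))  # {'meal_preparation'}
```

Rule sets like these are simple to deploy, which is one reason the early PIR/switch studies report such high specificity; the trade-off is that rules must be hand-tuned per home.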
Research Authors (Year) | Purpose | Characteristics | Outcomes |
---|---|---|---|
Abidine et al. [56] (2015) | Recognition of activities of daily living | The work proposed principal component analysis, independent component analysis, and linear discriminant analysis features with weighted support vector machines. The work also applied the features with other machine learning algorithms such as conditional random fields. | 94% accuracy. |
Aertssen et al. [57] (2011) | Recognition of activities of daily living | Motion information was extracted using motion history images and analyzed to detect three different actions for elderly people: walking, bending, and getting up. Shape deformations of the motion history images were investigated for different activities and later used for comparison during in-room monitoring. | 94% accuracy.
Auvinet et al. [58] (2008) | Fall detection | One of the authors of the work performed the falls on a mattress in a laboratory. The work mainly focused on the post-fall phase. Twenty-two fall events were recorded for the experiments. | An analytical study of the proposed design was performed rather than reporting accuracy.
Auvinet et al. [59] (2011) | Fall detection | The authors first recorded a dataset of videos from eight different cameras installed around the room where falls were simulated with the help of a neuropsychologist. For testing, some fake falls were also recorded. | 100% accuracy. |
Belshaw et al. [60] (2011) | Fall detection | Two in-home fall trials were performed in two real living rooms. In the first trial, the users performed simulated falls and real daily living behaviors for seven days. In the second trial, the users were instructed to simulate falls only, and 11 simulated falls were performed over seven days. | 100% sensitivity; 95% specificity.
Belshaw et al. [61] (2011) | Fall detection | An annotated training set was designed with fall or no-fall labels. For the experiments, three office rooms were set up for recording training and testing videos of simulated falls over the course of three weeks. | 92% sensitivity; 95% specificity.
Berlin & John [62] (2016) | Recognition of activities of daily living | Harris corner-based interest points and histogram-based features were applied with deep neural networks to recognize different human activities. The dataset consisted of six types of different activities: shake hands, hug, kick, point, punch, and push. | 95% accuracy. |
Brulin et al. [63] (2012) | Activity posture recognition | Fuzzy rules were applied to recognize different kinds of postures: sitting, lying, squatting, and standing. | 74.29% accuracy.
Chen et al. [64] (2016) | Recognition of activities of daily living | An action graph of skeleton-based features was extracted and applied with maximum likelihood estimation. Twenty different actions with 557 sequences were tried. The experiments included a cross-subject test in which half of the subjects were used for training and the rest for testing. The experiments were repeated 252 times with different folds. | 96.1% accuracy.
Chia-Wen & Zhi-Hong [65] (2007) | Fall detection | The authors recorded a total of 78 videos for fall detection where 48 were used for training and 30 for testing. They focused on three feature parameters (i.e., the centroid of a silhouette, the highest vertical projection histogram, and the fall-down duration) to represent three different motion types (i.e., walk, fall, and squat). | 86.7% sensitivity; 100% specificity. |
Du et al. [66] (2015) | Recognition of activities of daily living | Skeleton data were extracted by subnetworks and then applied with a hierarchical bidirectional recurrent neural network. More than 7000 images were used to determine the postures from different activities such as undetermined, lying, squatting, sitting, and standing. | 100% accuracy.
Foroughi et al. [67] (2008) | Fall and activities of daily living recognition | The authors applied a best-fit approximation ellipse of the silhouette, histograms, and temporal variations of head position as features to represent daily activities and falls. Fifty subjects each recorded 10 activities five times for the experiments. | 97% accuracy.
Huang et al. [68] (2016) | Recognition of activities of daily living | Lie group features were extracted and applied with a Lie group network for the recognition of different human activities. The experiments included the largest 3D activity recognition dataset, consisting of more than 56,000 sequences from 60 different activities performed by 40 different subjects. | 89.10% accuracy.
Krekovic et al. [69] (2012) | Fall detection | The fall detection system consisted of background estimation, moving object extraction, motion feature extraction, and finally, fall detection. The dynamics of human motion and body orientation were the focus. A small dataset was built. | 90% accuracy.
Lan et al. [70] (2015) | Recognition of activities of daily living | Dense activity trajectories were developed using histogram of oriented gradients and histogram of optical flow features and applied with support vector machines. The proposed method was validated on four different challenging datasets: Hollywood2, UCF101, UCF50, and HMDB51. | 94.4% accuracy.
Li et al. [71] (2016) | Recognition of activities of daily living | Vector of locally aggregated descriptor (VLAD) features were applied to analyze the deep dynamics of the activities and were later combined with deep convolutional neural networks. The proposed approach was tried on a public dataset of 16 different activities. | 90.81% accuracy.
Li et al. [72] (2012) | Fall detection | The experimental dataset used in the work consisted of two kinds of activities: falls and non-falls. The subjects were trained by nursing collaborators to fall like an elderly person. The first dataset was recorded in a laboratory where a mattress was used to fall on; it consisted of 240 fall and non-fall videos (i.e., 120 of each). The second dataset was recorded in a realistic environment in four different apartments, where each subject performed six falls on a mattress. | 100% sensitivity; 97% specificity.
Lee & Mihailidis [73] (2005) | Fall detection | Trials for the experimental analysis were performed in a mock bedroom setting. The room contained a bed, a chair, and random bedroom furniture. The subjects were asked to complete five scenarios, which generated a total of 315 tasks consisting of 126 falls and 189 non-falls. | 77% accuracy.
Lee & Chung [74] (2012) | Fall detection | A Kinect depth camera with a laptop was installed to record a total of 175 videos of different fall scenarios in indoor environments. | 97% accuracy.
Leone et al. [75] (2011) | Fall detection | A geriatrician provided instructions for the simulation of falls, which were performed using crash mats and knee or elbow protectors. A total of 460 videos were simulated, of which 260 were falls. Several activities of daily living other than falls were also simulated to evaluate the ability to discriminate falls from activities of daily living. | 97.3% sensitivity; 80% specificity.
Mirmahboub et al. [76] (2013) | Fall detection | The experimental dataset consisted of 24 scenarios. In each scenario, a subject performed activities such as falling, sitting on a sofa, walking, and pushing objects. All activities were performed by one subject in different clothing. | 95.2% accuracy.
Mo et al. [77] (2016) | Recognition of activities of daily living | Robust features were automatically extracted from body skeletons. The features were then applied with deep convolutional neural networks for modeling and recognition of 12 different daily activities. | 81.8% accuracy. |
Nyan et al. [78] (2008) | Fall detection | A total of 20 sets of data were recorded for different activities such as forward fall, backward fall, sideways fall, fall to half-left, and fall to half-right. Subjects were also asked to simulate activities of daily living. | 100% accuracy.
Peng et al. [79] (2014) | Recognition of activities of daily living | Space-time interest points, histogram of oriented gradients, and histogram of optical flow features were applied with support vector machines. The proposed approach was tried on three different realistic datasets: UCF50, UCF101, and HMDB51. | 92.3% accuracy. |
Peng et al. [80] (2014) | Recognition of activities of daily living | Robust dense trajectories were encoded with stacked Fisher kernels and applied with support vector machines for activity recognition. The approach was tried on three large datasets collected from different sources such as YouTube. | 93.38% accuracy. |
Rougier et al. [81] (2011) | Fall detection | A shape matching technique was used to track a silhouette through a video sequence. Then, a Gaussian mixture model was used for fall detection. | 100% accuracy.
Shahroudy et al. [82] (2015) | Recognition of activities of daily living | Robust features were extracted using histogram of oriented gradients and histogram of optical flows. The features were then applied with support vector machines. The method was evaluated on three datasets: MSR-DailyActivity, MSR-Action3D, and 3D-ActionPairs. | 81.9% accuracy.
Shi et al. [83] (2016) | Recognition of activities of daily living | Three sequential deep trajectory descriptors were tried with deep recurrent neural networks and convolutional neural networks for efficient activity recognition. The approach was tried on three datasets: KTH, HMDB51, and UCF101. | 96.8% accuracy. |
Shieh & Huang [84] (2012) | Fall detection | Subjects were requested to perform different fall and non-fall events. The non-fall events included walking, running, sitting, and standing. The fall events included slipping, tripping, bending, and fainting in any direction. In the experimental analysis, a total of 60 and 40 videos were used for non-falls and falls, respectively. | 90% accuracy.
Simonyan & Zisserman [85] (2014) | Recognition of activities of daily living | Optical-flow-based temporal streams were applied with deep convolutional neural networks to model different human activities. The method was tried on two different benchmark datasets, where it showed performance competitive with state-of-the-art methods. | 88.0% accuracy.
Uddin [86] (2017) | Recognition of activities of daily living | Body parts in the depth images were first segmented based on random forests. Then, body skeletons were obtained from the segmented body parts. Furthermore, robust spatiotemporal features were extracted and applied with hidden Markov models. The approach was tried on a public dataset of 12 human activities to check its robustness. | 98.27% accuracy.
Uddin et al. [87] (2017) | Recognition of gaits | Spatiotemporal features were extracted using local directional edge patterns and optical flows. Then, deep convolutional neural networks were applied on them for normal and abnormal gait recognition. | 98.5% accuracy. |
Uddin et al. [88] (2017) | Recognition of activities of daily living | Body parts were segmented to obtain skeletons in the depth images based on random forests. Furthermore, spatiotemporal features were extracted based on skeleton joint positions and motion in consecutive frames. The body limbs were represented in a spherical coordinate system to obtain person-independent body features. Finally, the features were applied with deep convolutional neural networks on a public activity dataset of 12 different activities. | 98.27% accuracy.
Veeriah et al. [89] (2015) | Recognition of activities of daily living | Normalized pair-wise angles, offset of joint positions, histogram of the velocity, and pairwise joint distances were applied with differential recurrent neural network. The approach was applied to recognize activities in two public datasets: MSR-Action3D and KTH. | 93.96% accuracy. |
Wang et al. [90] (2014) | Recognition of activities of daily living | Local occupancy patterns were applied to obtain depth maps. Fourier temporal pyramid was used for temporal representations of activities. Finally, the features were applied on support vector machines to characterize 12 different activities in a public dataset. | 97.06% accuracy. |
Wang et al. [91] (2016) | Recognition of activities of daily living | Weighted hierarchical depth motion maps were applied on three-channel deep convolutional neural networks. The method was applied on four different public datasets: MSRAction3D, MSRAction3DExt, UTKinect-Action, and MSRDailyActivity3D. | 100% accuracy. |
Wang et al. [92] (2015) | Recognition of activities of daily living | Pseudo-color images on three-channel deep convolutional neural networks were utilized to recognize activities on four public datasets (i.e., MSRAction3D, MSRAction3DExt, UTKinect-Action, and MSRDailyActivity3D) where it achieved the state-of-the-art results. | 100% accuracy. |
Wang et al. [93] (2015) | Recognition of activities of daily living | Skeleton-based robust features were applied with support vector machines. The approach was evaluated on two challenging datasets (i.e., HMDB51 and UCF101) where it outperformed the conventional approaches. | 91.5% accuracy. |
Willems et al. [94] (2009) | Fall detection | A grayscale video-processing algorithm was applied to detect falls in the video. Background subtraction, shadow removal, ellipse fitting, and fall detection were performed based on fall angle and aspect ratio. Finally, fall confirmation was done using vertical projection histograms. | 85% accuracy. |
Yang et al. [95] (2017) | Recognition of activities of daily living | Low-level polynormals were assembled from local neighboring hypersurface normals and then aggregated into super normal vectors with a linear classifier. The proposed method outperformed other traditional approaches on four public datasets: MSRActionPairs3D, MSRAction3D, MSRDailyActivity3D, and MSRGesture3D. | 100% accuracy. |
Yu et al. [96] (2012) | Fall detection | Postures, activities, and falls were simulated in a laboratory setting. | 97.08% accuracy. |
Zhen et al. [97] (2016) | Recognition of activities of daily living | Space-time interest points with histogram of oriented gradient features were encoded with various encoding methods and then applied with support vector machines. The methods were tried on three public datasets: KTH, UCF-YouTube, and HMDB51. | 94.1% accuracy. |
Zhu et al. [98] (2016) | Recognition of activities of daily living | Co-occurrence features of skeleton joints were extracted and applied with deep recurrent neural networks with long short-term memory. The proposed method was validated on three different benchmark activity datasets: SBU kinect interaction, HDM05, and CMU. | 100% accuracy. |
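Several of the classical video approaches above (e.g., Willems et al. [94]) reduce fall detection to two silhouette measurements: the deviation of the fitted ellipse from vertical (the "fall angle") and the bounding-box aspect ratio. The decision rule can be sketched as follows; the thresholds below are illustrative only, not the values used in the cited paper.

```python
def fall_indicators(ellipse_angle_deg, width, height):
    """Two indicators used in angle/aspect-ratio fall detection:
    deviation of the fitted body ellipse from vertical, and the
    bounding-box aspect ratio (width / height) of the silhouette."""
    angle_from_vertical = abs(90.0 - ellipse_angle_deg)
    aspect_ratio = width / height
    return angle_from_vertical, aspect_ratio

def is_fall(ellipse_angle_deg, width, height,
            angle_thresh=45.0, ratio_thresh=1.0):
    """Flag a frame as a candidate fall when the body axis leans far
    from vertical AND the silhouette is wider than it is tall.
    Thresholds are illustrative, not from the cited paper."""
    angle, ratio = fall_indicators(ellipse_angle_deg, width, height)
    return angle > angle_thresh and ratio > ratio_thresh

# A standing person: near-vertical ellipse, tall silhouette.
print(is_fall(ellipse_angle_deg=85.0, width=60, height=170))  # False
# A lying person: near-horizontal ellipse, wide silhouette.
print(is_fall(ellipse_angle_deg=10.0, width=170, height=60))  # True
```

In the cited work, such a per-frame decision is followed by a confirmation step (vertical projection histograms) to suppress spurious detections.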
Research Authors (Year) | Purpose | Characteristics | Outcomes |
---|---|---|---|
Arcelus et al. [99] (2009A) | Sit-to-stand transfer detection | Pressure sensor arrays were installed in a bed and floor. Then, pressure information over time was analyzed. The motion of the center of pressure was observed in the wavelet domain to determine whether a transfer occurred. | Older adults generated shorter sit-to-stand durations of approximately 2.88 s. |
Arcelus et al. [100] (2009B) | Sit-to-stand and stand-to-sit transfer detection. | Pressure sensors were installed on the armrests of the toilet commode. Clinical parameters were successfully obtained from several stand-to-sit and sit-to-stand transfers. Elderly people were included in the experiments as subjects. | Clinical parameters were successfully obtained for characterizing sit-to-stand and stand-to-sit transfer sequences. Older adults took longer and used less force in both cases. |
Arcelus et al. [101] (2010) | Sit-to-stand and stand-to-sit transfer detection in bedroom and toilet. | The work focused on the analysis of sit-to-stand and stand-to-sit transfers that were performed by the occupant in the bedroom and bathroom. Pressure sensors were installed in a bed and the grab bars of a toilet commode. Then, clinical feature extraction was performed to determine a warning level. | The clinically relevant features that were obtained from both bed-exits and grab bar usage showed differences between healthy adults and those with impaired mobility. The functionality of the proposed system in keeping track of potential warning signs was demonstrated. |
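The pressure-mat analyses above start from the center of pressure (CoP), the pressure-weighted mean position of the load on the sensor array; the cited works then track the CoP trajectory over time (e.g., in the wavelet domain) to detect transfers. A minimal illustrative sketch, in which the 5 cm sensor pitch is an assumption rather than a value from the cited papers:

```python
def center_of_pressure(grid, cell_size=0.05):
    """Center of pressure (x, y in metres) of a 2-D pressure grid,
    i.e. the pressure-weighted mean of the cell coordinates.
    grid[r][c] holds the reading of the sensor at row r, column c;
    cell_size is the sensor pitch (an illustrative 5 cm here)."""
    total = sum(sum(row) for row in grid)
    if total == 0:
        return None  # no load on the mat
    x = sum(p * c * cell_size for row in grid
            for c, p in enumerate(row)) / total
    y = sum(p * r * cell_size for r, row in enumerate(grid)
            for p in row) / total
    return x, y

# Load concentrated on the right half of a 2x2 mat.
cx, cy = center_of_pressure([[0.0, 4.0],
                             [0.0, 4.0]])
print(round(cx, 3), round(cy, 3))  # 0.05 0.025
```

A sit-to-stand transfer then appears as a characteristic forward-and-up excursion of this (x, y) trajectory over a few seconds.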
Research Authors (Year) | Purpose | Characteristics | Outcomes |
---|---|---|---|
Fleury et al. [102] (2008) | Walking, bending, and sitting recognition | The work simulated activities in a laboratory setting. The case study considered one day of monitoring. | 100% accuracy. |
Khan et al. [103] (2015) | Fall detection | The proposed research work developed a fall detection system based on acoustic signals collected from elderly people while performing their normal activities. The authors constructed a data description model using a source separation technique, Mel-frequency cepstral coefficients, and a support vector machine to detect falls. The dataset used in the work consisted of 30 fall activities and 120 non-fall activities. | 100% accuracy |
Li et al. [104] (2010) | Fall detection | The proposed work presented an eight-microphone circular array for person tracking and fall detection. For the sound classification, the authors applied Mel-frequency cepstral coefficients. Main design features of the array were obtained by utilizing a simulation toolbox in MATLAB. | 100% accuracy |
Li et al. [105] (2011) | Fall detection and localization | The authors proposed an approach for improving the accuracy of acoustic fall detection based on sliding window position and duration in data. The authors found that by positioning the window at the starting position of the signal, the highest sound source localization performance was achieved. This work applied the Hilbert transform by using a finite impulse response filter on the signals. | 100% accuracy. |
Popescu et al. [106] (2008) | Fall detection | Five different types of falls were targeted for experiments. A nurse was assigned to direct the subjects during the recording sessions of falls. The experimental dataset consisted of six different sessions with 23 falls in total. | 100% accuracy. |
Popescu & Mahnot [107] (2009) | Fall detection | The proposed work investigated a one-class classifier that required only examples from one class (i.e., fall sounds) for training. Then, fall detection was carried out based on that training. | 100% accuracy. |
Vacher et al. [108] (2011) | Recognition of activities of daily living | The work proposed Gaussian mixture models and support vector machines for daily activity recognition. The system also tried to recognize significant events beyond routine daily activities. | 92% accuracy. |
Zhuang et al. [109] (2009) | Fall detection | The authors presented a fall detection system that used only the audio signal of the microphone. The system modeled each fall segment using a Gaussian mixture model supervector. A support vector machine was combined with the model supervectors to classify audio segments into falls. | 64% accuracy. |
Research Authors (Year) | Purpose | Characteristics | Outcomes |
---|---|---|---|
Alwan et al. [110] (2006) | Fall detection | The authors used human dummies to simulate real fall events. The experimental fall tests were performed on concrete floors. One dummy was used to emulate the scenario of a person falling while attempting to get out of a chair. Another dummy was used to emulate the scenario of a person falling from an upright position. The experiments were repeated three times with the same dummies. | 100% accuracy. |
Lombardi et al. [111] (2015) | Movement detection | A data model was proposed for storing and processing floor data. The proposed approach focused on estimating the center of floor pressure based on the widely used biomechanical concept of ground reaction force. Some practical tests on a real sensing floor prototype were attempted. The novel approach outperformed the traditional background subtraction schemas for the correct detection and tracking of people. | 97% accuracy. |
Serra et al. [112] (2014) | Footstep recognition | An easy-to-install and unobtrusive smart flooring system was proposed based on piezoelectric polymer floor sensors. The smart flooring system was utilized for efficient human footstep recognition based on the Pearson product–moment correlation coefficient between the testing and reference signals for similarity calculation. | 99% accuracy. |
Research Authors (Year) | Purpose | Characteristics | Outcomes |
---|---|---|---|
Forouzanfar et al. [113] (2017) | Event recognition, such as breathing and human motion | The work proposed a methodology for classifying different events, such as breathing. Many time- and frequency-domain features were derived from radar signals. Then, linear discriminant analysis was performed to reduce the dimension of the candidate feature set. Finally, Bayesian classifiers were used to detect the target events. | Breathing: 90% accuracy. Motion: 93% accuracy. |
Kim and Toomajian [114] (2016) | Gesture recognition | The work applied a deep convolutional neural network for hand gesture recognition using micro-doppler signatures. Ten different hand gestures were recognized using short-time fast Fourier transform features of the radar signals. | 93.1% accuracy. |
Lien et al. [115] (2016) | Gesture recognition | The work utilized a millimeter-wave radar to develop a novel, robust, high-resolution, and low-power gesture sensing technology. The overall system consisted of radar design principles, high-temporal-resolution hand tracking, a hardware abstraction layer, a radar chip, interaction models, and gesture vocabularies. The system can track gestures at 10,000 frames per second. | 98% accuracy. |
Rui et al. [116] (2017) | Walking speed estimation | The paper proposed an algorithm for estimating the walking speed of a human using a doppler radar system. The system was designed with the aim of passive gait assessment of elderly people. Furthermore, the work analyzed zero-crossing periods of the radar signals in the time domain to improve the dynamics of the gait signature. | 97% accuracy. |
Wan et al. [117] (2014) | Gesture recognition | A gesture recognition system was proposed, which was based on portable smart radar sensors with high accuracy for differentiating different types of hand and head movements. The authors adopted principal component analysis in the time and frequency domains to analyze two different sets of gestures. | 100% accuracy. |
Research Authors (Year) | Purpose | Characteristics | Outcomes | Sensors |
---|---|---|---|---|
Alwan et al. [118] (2006) | Recognition of activities of daily living | Systems for detecting activities of daily living were installed in 15 assisted living units. The reports were sent to professional caregivers of the residents. Fifteen residents and six caregivers participated in the study. It was a pilot study in which monitoring was performed for three months. Quality of life was assessed using a standard satisfaction-with-life scale instrument. | There was a high acceptance rate of the system. The approach could be used for improved healthcare planning and detection of health status changes. | PIR motion sensors, stove sensor, bed pressure sensor. |
Alwan et al. [119] (2006) | Recognition of activities of daily living | Activities of daily living were monitored for 26 elderly residents and 25 caregivers over four months. A standard satisfaction-with-life scale instrument was used to assess the quality of life of the elderly people and the caregivers. | Once four months of monitoring were finished, there was no significant difference in the quality-of-life scores of the elderly users and the caregivers. The system seemed to be highly acceptable. | PIR motion sensors, stove sensor, bed pressure sensor. |
Alwan et al. [120] (2007) | Recognition of activities of daily living | The purpose of the work was to assess the impact of passive health status monitoring in assisted living. Two aspects were analyzed: the cost of care and the efficiency of caregivers. Activities of daily living were monitored for 21 residents for over three months. | The study demonstrated that the monitoring technologies that were used in the work significantly reduced billable interventions, hospital days, and cost of care to payers. Moreover, they had a positive impact on professional caregivers’ efficiency. | PIR motion sensors, stove sensor, pressure sensors. |
Ariane et al. [121] (2012) | Fall detection | The proposed fall detection system was evaluated in simulation using scenarios from an existing dataset. | 89.33% accuracy. | PIR motion sensors, pressure mats. |
Bemis et al. [122] (2008) | Recognition of activities of daily living | It was a case study on two residences based on seven and four months of monitoring. | The functionality of the system in detecting activities and deviations in patterns of activities was described. | Video monitoring, PIR motion sensors. |
Bemis et al. [123] (2010) | Recognition of activities of daily living | The work reported the progress in sensors, middleware, and behavior interpretation mechanisms, spanning from simple rule-based alerts to algorithms for extracting the temporal routines of the users. | The functionality of the system was demonstrated. | Video monitoring, PIR motion sensors. |
Celler et al. [124] (1996) | Recognition of activities of daily living | The work presented a smart home monitoring system that was based on sequences of pressure readings. It mainly focused on pressure transfers in the bedroom and bathroom to check whether motion was within the normal range. | The functionality of the system was demonstrated. The system showed encouraging results for precise fine-grained activity monitoring, especially using high-precision user localization sensors. | PIR motion sensors, sound sensors, temperature sensors, light sensors, pressure sensors. |
Chung et al. [125] (2017) | Sleep stage classification | A novel approach was proposed for sleep stage classification using a doppler radar and a microphone. The classification algorithm was designed based on a standard polysomnography reference-based database and medical knowledge of doctors and sleep technologists at a hospital. The algorithm outperformed commercially available products for a specific database. | 100% accuracy. | Doppler radar and microphone. |
Guettari et al. [126] (2010) | Localization | This work proposed a localization system that was based on a combination of infrared sensors and sound sensors. The system mainly used the azimuth angles of the sources. This multimodal system improved the precision of localization compared to a standalone system. | A 54% improvement was achieved using the proposed multimodal system compared to a standalone one. | PIR motion sensors and sound sensors. |
Kinney et al. [127] (2004) | Recognition of activities of daily living | It was a pilot study on 19 families for activity monitoring. Monitoring was performed for six months. | The main advantage of the system was the ease of tracking the users. The main disadvantage was the annoyance created by false alerts. Equipping the home cost $400, with a maintenance cost of $90 per month. | Video camera, PIR motion sensors. |
Lotfi et al. [128] (2011) | Recognition of activities of daily living | It was a case study on two dementia patients. The first patient was monitored for 20 days. The second patient was monitored for 18 months. | The system was used to identify abnormal behavior. The system demonstrated satisfactory performance in identifying health status using different ambient sensors. | PIR motion sensors, door opening sensors, flood sensors. |
Rantz et al. [129] (2008) | Fall detection | A case study was performed for retrospective analysis of fall detection data. | A change of health status was detected by the system but ignored by the nurses. | Video camera, PIR motion sensors, bed pressure sensors, door sensors. |
Van Hoof et al. [130] (2011) | Recognition of activities of daily living | It was a pilot study for daily activity monitoring and fire and wandering detection. The system was installed for periods of 8–23 months for analysis. | Use of the proposed system improved the sense of safety and security. | PIR motion sensors, video camera. |
Zhou et al. [131] (2011) | Recognition of activities of daily living | The work tried to recognize simulated activities that were monitored in a testbed for a month. | 92% precision; 92% recall. | Video camera, PIR motion sensors. |
Zouba et al. [132] (2009) | Recognition of activities of daily living | The authors recognized simulated activities that were monitored in a laboratory setting. | 62–94% precision; 62–87% sensitivity. | Video camera, PIR motion sensors. |
Zouba et al. [133] (2009) | Recognition of activities of daily living | The work was focused on monitoring simulated activities in a laboratory setting. | 50–80% precision; 66–100% sensitivity. | Video camera, PIR motion sensors. |
Research Authors (Year) | Purpose | Characteristics | Outcomes | Sensors |
---|---|---|---|---|
Aghajan et al. [134] (2007) | Significant event detection | A sensor network that consisted of various types of sensors was used. Based on sensor data, event detection modalities with distributed processing were applied for smart home applications. More specifically, a distributed vision-based analysis was carried out for the detection of the occupant’s posture. Then, features from multiple cameras were combined via a rule-based approach for significant event detection. | 96.7% accuracy. | Accelerometer sensors, video camera, PIR motion sensors. |
Bang et al. [135] (2008) | Recognition of activities of daily living | An accelerometer and environmental-sensor-based approach was proposed. Conditional probabilities were used for recognition of daily activities that combine human motion and contacts with objects. | 97% accuracy. | Accelerometer sensors, environmental sensors, PIR motion sensors. |
Bianchi et al. [136] (2009) | Fall detection | The study evaluated barometric pressure sensing along with accelerometer-based fall detection. Signal processing techniques (e.g., signal magnitude area) and a classification algorithm (support vector machines) were used to discriminate falls from typical daily activities. | 97.5% accuracy. | Accelerometer sensors and barometric pressure sensors. |
Cao et al. [137] (2009) | Recognition of activities of daily living | An event-driven context-aware computing model was proposed for recognizing daily activities. | Elderly health monitoring through the proposed system showed the effectiveness of the proposed model. | Video camera, accelerometer sensors. |
Hein et al. [138] (2010) | Recognition of activities of daily living | A two-fold approach was described. First, sensors were selected based on interviews of elderly people, their relatives, and caregivers. Then, based on the outcome of the interviews, a sensor-based system was utilized to recognize different daily human activities. | Maximum 96.1% sensitivity and 90.3% specificity. | Accelerometer sensors, video camera, PIR motion sensors, door sensors. |
Medjahed et al. [139] (2009) | Recognition of activities of daily living | A fuzzy-logic-based approach was proposed for robust human activity recognition on simulated data. | 97% accuracy. | Sound sensors, PIR motion sensors, physiological sensors, state-change sensors. |
Nyan et al. [140] (2006) | Fall detection | A fall detection approach was proposed using gyroscopes. Angles from different sides were explored for accurately modelling fall detection. | Maximum 100% sensitivity and 97.5% specificity. | Gyroscope sensors, video camera. |
Roy et al. [141] (2011) | Recognition of activities of daily living | This work proposed a daily activity recognition framework that used possibility theory and description-logic-based semantic modeling. Different machine learning approaches (e.g., Gaussian mixture models, hidden Markov models, deep belief networks) were analyzed. | 95% accuracy. | Pressure sensors, accelerometer sensors, video sensors, PIR motion sensors. |
Sim et al. [142] (2011) | Recognition of activities of daily living | The work applied mining of correlated patterns in activity recognition systems. | The correlated activity pattern mining approach showed 35.5% higher accuracy than typical frequent mining systems. | RFID sensors, accelerometer sensors, reed switches, PIR motion sensors, pressure sensors. |
Srinivasan et al. [143] (2007) | Fall detection | The system applied triaxial accelerometer and motion detector sensor data in a two-step fall detection algorithm. First, the system tried to detect falls using the normalized energy expenditure from acceleration values. Then, falls were confirmed by considering the absence of motion. Some thresholds and logic were used to detect falls. | 100% accuracy for coronal falls and 94.44% for sagittal falls. | Accelerometer sensors, PIR motion sensors. |
Tolkiehn et al. [144] (2011) | Fall detection | The system used a 3D accelerometer and a barometric pressure sensor for robust fall detection, along with detection of the fall direction. The basic probability-based amplitude and angular features were obtained from accelerometer sensors. Later, a pressure threshold was used. | Maximum 89.97% accuracy for fall prediction and 94.12% for fall direction. | Accelerometer sensor, barometric pressure sensor. |
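The two-step fusion logic of Srinivasan et al. [143] — an acceleration spike followed by confirmation from the *absence* of subsequent motion on the motion detector — can be sketched as follows. The 2.5 g threshold and the function shape are illustrative assumptions, not the paper's exact values.

```python
def detect_fall(accel_magnitudes, motion_after_impact,
                impact_g=2.5, still_required=True):
    """Two-step fall check in the spirit of accelerometer + motion
    detector fusion: (1) look for an acceleration spike above an
    impact threshold, (2) confirm the fall by the absence of motion
    afterwards. Thresholds are illustrative only."""
    impact = max(accel_magnitudes) > impact_g
    if not impact:
        return False
    return (not motion_after_impact) if still_required else True

# Impact followed by stillness -> confirmed fall.
print(detect_fall([1.0, 1.1, 3.2, 0.4], motion_after_impact=False))  # True
# Impact but the person keeps moving (e.g. sat down hard) -> no fall.
print(detect_fall([1.0, 1.1, 3.2, 0.4], motion_after_impact=True))   # False
```

The confirmation step is what separates this scheme from threshold-only accelerometer detectors: it uses the ambient PIR sensor to reject vigorous but harmless movements.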
Robot | Characteristics | Ambient Sensors |
---|---|---|
AIBO [145] (2008) | It is a dog-like robot that is capable of facial expressions. The companion robot can display how it feels through six emotional states: happiness, dislike, anger, love, sadness, and surprise. It has touch sensors on the head, chin, and back. Stereo microphones allow it to hear. The camera helps it to see and balance. It also uses infrared, acceleration, and temperature sensors to adapt to its surroundings. | Touch sensors, video camera, distance sensor, microphone, temperature sensor. |
AILISA [146] (2005) | It is a machine-like robot. It provides mobility aid, physiological monitoring, and fall monitoring. | Motion sensors, wireless weight scale sensors. |
Cafero [147] | It is a machine-like robot that helps with monitoring and recording vital signs, telepresence, cognitive training, entertainment and reminiscence, and scheduling activities. | Infrared sensor, camera sensor, laser range finder, sonar sensors. |
Care-O-bot [148] (2009) | It is a human-like robot. It provides walking and navigation aid, fetching objects, security, health and personal safety monitoring, cleaning tasks, heating food, telepresence, and medication reminders. | 3-D time-of-flight cameras, stereo camera, microphone sensor on robot head, tactile sensors on robot hand. |
GiraffPlus [149] (2014) | It is a machine-like robot with a touchscreen interface. It allows remote people (i.e., caregivers, family, and friends) to virtually visit an elderly person’s home, move the robot about freely, and communicate with the elderly person through video conferencing technology. | Passive infrared detector, electrical usage sensor, a pressure sensor (bed/sofa/chair), accelerometer sensor, physiological sensors (e.g., body weight, blood pressure, pulse rate). |
Hector [150] (2013) | It is a machine-like robot with a touchscreen interface. It provides aids for recording daily routines, controlling the environment, cognitive training, reminding to take medication, reviewing of daily agendas, detecting falls, and providing help during emergency. | Kinect depth camera, web camera, fisheye camera, microphone, IR motion sensor. |
Act [151] (2008) | It is a cat-like robot. Through a camera, it can recognize objects and faces. Its microphones can recognize speech and the direction of the sound source. It can also sense touch through its touch sensors. | Video camera, touch sensor, microphone. |
NeCoRo [152] (2005) | It is a cat-like fluffy robot. The tactile sensors in its head, chin, and back can sense a stroking or patting. A microphone in its head can detect sound and the source of the sound. A camera helps it avoid obstacles. An acceleration sensor helps it recognize its position while spinning around. | Video camera, touch sensor, microphone, tactile sensor, position sensor, vision light sensor. |
Paro [153] (2016) | It is a seal-like fluffy robot that is capable of facial expressions. It can provide company to elderly people who are living alone. It can perceive its environment with the help of five types of sensors: temperature, tactile, light, sound, and posture sensors. Light sensors help it to recognize light and dark. Tactile sensors help it feel when it is being stroked or beaten. Posture sensors help it sense when it is being held. Sound sensors help it perceive the directions of voices and words. | Light sensor, auditory (determination of sound source direction and speech recognition) sensor, balance sensor, tactile sensor. |
Pearl [154] (2002) | It is a human-like robot with a head that is capable of facial expressions. It aids in reminding of daily agendas, guiding around the home, reminding of appointments, telepresence, monitoring of health, and opening or closing the refrigerator. | Navigation sensors that use a laser range-finder, sonar sensors, microphones for speech recognition, stereo camera systems. |
Ri-man [155] (2007) | It is a human-like machine. It aids in lifting and carrying people. | Tactile sensors. |
MOVAID [156] (2007) | It is a machine-like robot with no head, approximately two meters tall. The robot was designed for heating and delivering food, changing bed linen, and cleaning kitchen benches. | Ultrasound, force, camera, local positioning, laser, infrared, and tilt sensors. |
Guido [157] (2005) | It is a machine-like robot with a height of around one meter. This robot mainly aids with walking and navigation. | Force, local positioning, and laser sensors. |
HOMIE [158] (2005) | It is a dog-like robot that was designed to keep elderly people company. It can show some emotions and also provides entertainment and medical attendance services. | Microphone, pressure, and motion sensors. |
Wakamaru [159] (2006) | It is a human-like robot whose head can perform some facial expressions. The functions of the robot include security, managing schedules, information services, face recognition, conversation, medication reminders, and reporting unusual situations. | Camera, ultrasonic, bumper, microphone, and step detection sensors. |
IRobiQ [160] (2010) | It is a human-like Korean robot with a static face. The height of the robot is approximately 0.3 m. The robot helps users with medication reminders, cognitive training, entertainment, telepresence communication, and vital sign monitoring. | Touch screen, microphone, infrared sensor, ultrasonic, and camera sensors. |
Ifbot [161] (2004) | It is a human-like robot with a static face and a height of approximately 0.3 m. The robot was designed for entertainment purposes, cognitive training, and basic health monitoring. | Ultrasonic, infrared, camera, microphone, touch, and shock sensors. |
Teddy [162] (2011) | It is a bear-like companion robot that can show different kinds of emotions via facial expressions. | Touch, microphone, and camera sensors. |
Huggable [163] (2006) | It is a bear-like companion robot capable of some facial expressions. The simple robot can give and receive hugs. More than 1000 touch sensors underneath its skin give the robot a sensitive skin. | Cameras, touch, force, and microphone sensors. |
iCat [164] (2018) | It is a cat-like companion robot with some facial expressions. The height of the robot is approximately 0.4 m. | Camera, microphone, and touch sensors. |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Uddin, M.Z.; Khaksar, W.; Torresen, J. Ambient Sensors for Elderly Care and Independent Living: A Survey. Sensors 2018, 18, 2027. https://doi.org/10.3390/s18072027