A hybrid CNN and BLSTM network for human complex activity recognition with multi-feature fusion

Published in Multimedia Tools and Applications

Abstract

A hybrid convolutional neural network (CNN) and bidirectional long short-term memory (BLSTM) network for human complex activity recognition with multi-feature fusion is proposed in this paper. Specifically, a new CNN model is designed to extract spatial features from the sensor data. Because, in activity recognition, the output at the current moment is related not only to the previous state but also to the subsequent state, a BLSTM network is further used to extract the temporal context of the state information and improve recognition performance. To fully mine the features in the sensor data and further improve recognition performance, a new feature selection method named SFSANW (sequential forward selection and network weights), based on the sequential forward selection algorithm and network weights, is proposed to select features extracted by traditional methods and obtain dominant features. The dominant features are then fused with the feature vectors extracted by the hybrid CNN and BLSTM network. Experiments are performed on two complex activity datasets, PAMAP2 and UT-Data, yielding F1 scores of 92.23% and 98.07%, respectively. The experimental results demonstrate that the proposed method achieves better complex activity recognition performance than traditional machine learning algorithms and state-of-the-art deep learning algorithms.
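The abstract only outlines the architecture, so as a concrete illustration the following is a minimal PyTorch sketch of a hybrid CNN and BLSTM recogniser with feature-level fusion. The sensor channel count, window length, layer sizes, number of classes, and the number of selected hand-crafted features are illustrative assumptions, not the configuration reported in the paper.

import torch
import torch.nn as nn

class HybridCnnBlstm(nn.Module):
    def __init__(self, in_channels=9, n_classes=12, hand_feat_dim=20):
        super().__init__()
        # CNN branch: spatial features from a raw sensor window of shape (batch, channels, time)
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bidirectional LSTM over the CNN feature map captures temporal context in both directions
        self.blstm = nn.LSTM(input_size=128, hidden_size=64,
                             batch_first=True, bidirectional=True)
        # Classifier on the concatenation of network features and selected hand-crafted features
        self.classifier = nn.Linear(2 * 64 + hand_feat_dim, n_classes)

    def forward(self, x, hand_feats):
        z = self.cnn(x)                  # (batch, 128, time/4)
        z = z.permute(0, 2, 1)           # (batch, time/4, 128) for the LSTM
        out, _ = self.blstm(z)           # (batch, time/4, 128)
        summary = out[:, -1, :]          # last step summarises the sequence
        fused = torch.cat([summary, hand_feats], dim=1)
        return self.classifier(fused)

# Example: a batch of 16 windows, 9 sensor channels, 128 samples, 20 selected hand-crafted features
model = HybridCnnBlstm()
logits = model(torch.randn(16, 9, 128), torch.randn(16, 20))  # -> shape (16, 12)

The SFSANW step combines sequential forward selection with network weights. The sketch below shows only the generic sequential-forward-selection half, with a cross-validated random-forest F1 score standing in as an assumed selection criterion; the paper's network-weight criterion is not reproduced here.

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def sequential_forward_selection(X, y, max_feats=10):
    # X: (samples, candidate hand-crafted features) as a NumPy array, y: activity labels.
    # Greedily add the candidate feature that most improves cross-validated macro F1.
    selected, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining and len(selected) < max_feats:
        scored = []
        for f in remaining:
            clf = RandomForestClassifier(n_estimators=50, random_state=0)
            s = cross_val_score(clf, X[:, selected + [f]], y,
                                cv=3, scoring="f1_macro").mean()
            scored.append((s, f))
        s, f = max(scored)
        if s <= best:        # stop when no remaining feature improves the score
            break
        best = s
        selected.append(f)
        remaining.remove(f)
    return selected

In the pipeline described by the abstract, the features retained by such a selection step are then concatenated with the CNN and BLSTM feature vector, as in the classifier input of the first sketch.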


Availability of data and material

Data and material are fully available without restriction.

Code availability

Custom code is available without restriction.


Acknowledgements

This work was supported by the Zhejiang Provincial Natural Science Foundation of China [grant number LY19F020032] and the National Natural Science Foundation of China [grant numbers 61872322, U1909203, 62036009].

Funding

This study was funded by the Zhejiang Provincial Natural Science Foundation of China (grant number LY19F020032) and the National Natural Science Foundation of China (grant numbers 61872322, U1909203, 62036009).

Author information

Corresponding author

Correspondence to Ruohong Huan.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Huan, R., Zhan, Z., Ge, L. et al. A hybrid CNN and BLSTM network for human complex activity recognition with multi-feature fusion. Multimed Tools Appl 80, 36159–36182 (2021). https://doi.org/10.1007/s11042-021-11363-4

