Abstract
Skeleton-based motion capture (MoCap) systems have long been used in the game and film industries to reproduce complex human actions. MoCap data has also proved effective for human activity recognition tasks. However, recognition remains challenging on smaller datasets, and the scarcity of such data for industrial activities adds to the difficulty. In this work, we propose an ensemble-based machine learning methodology targeted at MoCap datasets. Experiments were performed on the MoCap data provided in the Bento Packaging Activity Recognition Challenge 2021 (bento is the Japanese word for a lunch box). After processing the raw MoCap data, the proposed ensemble model achieved an accuracy of 98% on tenfold cross-validation and 82% on leave-one-out cross-validation.
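As a rough illustration of this evaluation setup (not the authors' exact pipeline), the sketch below builds a soft-voting ensemble and scores it with tenfold and leave-one-subject-out cross-validation. The base learners (random forest, SVM, k-NN) and the arrays `X`, `y`, and `subjects` are hypothetical stand-ins for windowed MoCap features, activity labels, and per-window subject identifiers.

```python
# Minimal sketch, assuming statistical features have already been extracted
# from the raw MoCap joint trajectories; the data below is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import KFold, LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))           # placeholder: windowed MoCap feature vectors
y = rng.integers(0, 5, size=200)         # placeholder: activity labels
subjects = rng.integers(0, 4, size=200)  # placeholder: subject id per window

# Soft-voting ensemble over three assumed base classifiers.
ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
        ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier())),
    ],
    voting="soft",
)

# Tenfold cross-validation accuracy.
tenfold = cross_val_score(
    ensemble, X, y, cv=KFold(n_splits=10, shuffle=True, random_state=0)
)
print("10-fold accuracy: %.3f" % tenfold.mean())

# Leave-one-subject-out accuracy (one interpretation of the paper's
# leave-one-out protocol).
loso = cross_val_score(ensemble, X, y, groups=subjects, cv=LeaveOneGroupOut())
print("leave-one-subject-out accuracy: %.3f" % loso.mean())
```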
Appendix
See Table 3.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Nazmus Sakib, A.H.M., Basak, P., Doha Uddin, S., Mustavi Tasin, S., Ahad, M.A.R. (2022). Can Ensemble of Classifiers Provide Better Recognition Results in Packaging Activity?. In: Ahad, M.A.R., Inoue, S., Roggen, D., Fujinami, K. (eds) Sensor- and Video-Based Activity and Behavior Computing. Smart Innovation, Systems and Technologies, vol 291. Springer, Singapore. https://doi.org/10.1007/978-981-19-0361-8_10
DOI: https://doi.org/10.1007/978-981-19-0361-8_10
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-0360-1
Online ISBN: 978-981-19-0361-8
eBook Packages: Intelligent Technologies and Robotics (R0)