Research Article | Open Access

Unsupervised Factory Activity Recognition with Wearable Sensors Using Process Instruction Information

Published: 21 June 2019

Abstract

This paper presents an unsupervised method for recognizing the assembly work of factory workers from wearable sensor data. Such assembly work is a common part of line production systems and typically involves a worker repeating a work process made up of a sequence of manual operations, such as setting a board on a workbench and screwing parts onto it. This study aims to recognize the starting and ending times of the individual operations in such work processes by analyzing sensor data collected from the workers together with the process instructions that describe the flow of operations in each work process. We propose a particle-filter-based factory activity recognition method that leverages (i) trend changes in the sensor data detected by a nonparametric Bayesian hidden Markov model, (ii) semantic similarities between operations discovered in the process instructions, (iii) sensor-data similarities between consecutive repetitions of individual operations, and (iv) frequent sensor-data patterns (motifs) discovered in the overall assembly work process. We evaluated the proposed method on sensor data from six workers collected in actual factories, achieving a recognition accuracy of 80% (macro-averaged F-measure).
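Component (iii) above compares sensor data across consecutive repetitions of an operation, which requires a distance measure that tolerates variations in how fast a worker performs the motion. A standard choice for such alignment-tolerant comparison is dynamic time warping (DTW). The sketch below is illustrative only; the abstract does not specify the features or distance measure the authors actually use.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    Illustrative sketch: compares sensor windows from consecutive
    repetitions of an operation while tolerating speed variations.
    """
    n, m = len(a), len(b)
    # D[i, j] holds the minimal cumulative cost of aligning a[:i] with b[:j].
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: insertion, deletion, or match.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# Two repetitions of the same motion performed at different speeds
# align closely under DTW even though their lengths differ.
rep1 = [0.0, 1.0, 2.0, 1.0, 0.0]
rep2 = [0.0, 1.0, 1.0, 2.0, 2.0, 1.0, 0.0]  # slower repetition
print(dtw_distance(rep1, rep2))
```

In practice, a multivariate variant over windows of accelerometer features would be used rather than raw 1-D samples, but the recurrence is the same.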




Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 3, Issue 2
June 2019, 802 pages
EISSN: 2474-9567
DOI: 10.1145/3341982
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 21 June 2019
Accepted: 01 June 2019
Received: 01 February 2019
Published in IMWUT Volume 3, Issue 2


Author Tags

  1. Activity recognition
  2. factory work
  3. wearable sensor

Qualifiers

  • Research-article
  • Research
  • Refereed


Article Metrics

  • Downloads (Last 12 months)181
  • Downloads (Last 6 weeks)24
Reflects downloads up to 05 Jan 2025

Cited By
  • PrISM-Q&A: Step-Aware Voice Assistant on a Smartwatch Enabled by Multimodal Procedure Tracking and Large Language Models. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8, 4 (Nov. 2024), 1-26. DOI: 10.1145/3699759
  • CrossHAR: Generalizing Cross-dataset Human Activity Recognition via Hierarchical Self-Supervised Pretraining. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8, 2 (May 2024), 1-26. DOI: 10.1145/3659597
  • Preliminary Investigation of SSL for Complex Work Activity Recognition in Industrial Domain via MoIL. In 2024 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 465-468. DOI: 10.1109/PerComWorkshops59983.2024.10503195
  • OpenPack: A Large-Scale Dataset for Recognizing Packaging Works in IoT-Enabled Logistic Environments. In 2024 IEEE International Conference on Pervasive Computing and Communications (PerCom), 90-97. DOI: 10.1109/PerCom59722.2024.10494448
  • Worker Activity Recognition in Manufacturing Line Using Near-Body Electric Field. IEEE Internet of Things Journal 11, 7 (Apr. 2024), 11554-11565. DOI: 10.1109/JIOT.2023.3330372
  • An unsupervised embedding method based on streaming videos for process monitoring in repetitive production systems. IISE Transactions (2024), 1-22. DOI: 10.1080/24725854.2024.2386415
  • Development of a wireless smart sensor system and case study on lifting risk assessment. Manufacturing Letters 41 (Oct. 2024), 229-240. DOI: 10.1016/j.mfglet.2024.09.027
  • Fusion of kinematic and physiological sensors for hand gesture recognition. Multimedia Tools and Applications 83, 26 (2024), 68013-68040. DOI: 10.1007/s11042-024-18283-z
  • AI-assisted Monitoring of Human-centered Assembly: A Comprehensive Review. International Journal of Precision Engineering and Manufacturing-Smart Technology 1, 2 (Jul. 2023), 201-218. DOI: 10.57062/ijpem-st.2023.0073
  • Human Activity Recognition with an HMM-Based Generative Model. Sensors 23, 3 (Jan. 2023), 1390. DOI: 10.3390/s23031390
