
DOI: 10.1145/3687272.3688304
Research article · Open access

A Gesture-based Interactive System for Automated Material Handling Vehicles: Implementation and Comparative Study

Published: 24 November 2024

Abstract

Leveraging gesture-based controls within a remote interface for automated material handling operations is a promising avenue for improving the operator experience in outdoor environments. We present a gesture-based interactive system designed to facilitate collaboration between a human operator and a semi-autonomous forklift in material handling processes. Our interface supports commands that include starting the forklift, defining the loading/unloading area, and specifying the number of pallets to be transported. In a field study, we tested the proposed solution and compared it to a touch-based user interface to explore its effectiveness and to identify potential challenges for implementation and adoption in real-world industrial settings.
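The command set described in the abstract (start the forklift, define the loading/unloading area, set the pallet count) can be sketched as a small gesture-to-command dispatcher. This is a hypothetical illustration only: the gesture labels, the `ForkliftCommand` type, and the dispatch logic are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Gesture(Enum):
    """Recognized operator gestures (hypothetical labels)."""
    THUMBS_UP = auto()      # start the forklift
    POINT = auto()          # mark a loading/unloading area
    FINGER_COUNT = auto()   # set the number of pallets


@dataclass
class ForkliftCommand:
    """Command sent to the semi-autonomous forklift (illustrative)."""
    action: str
    payload: dict = field(default_factory=dict)


def dispatch(gesture: Gesture, **kwargs) -> ForkliftCommand:
    """Map a recognized gesture to a forklift command."""
    if gesture is Gesture.THUMBS_UP:
        return ForkliftCommand("start")
    if gesture is Gesture.POINT:
        # kwargs carries the pointed-at ground coordinates
        return ForkliftCommand("set_area", {"x": kwargs["x"], "y": kwargs["y"]})
    if gesture is Gesture.FINGER_COUNT:
        return ForkliftCommand("set_pallets", {"count": kwargs["count"]})
    raise ValueError(f"unmapped gesture: {gesture}")


cmd = dispatch(Gesture.FINGER_COUNT, count=3)
print(cmd.action, cmd.payload["count"])  # set_pallets 3
```

In a real system the `Gesture` values would come from a recognition pipeline (e.g. a camera or wearable sensor), and the dispatcher would validate arguments before forwarding commands to the vehicle.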



Published In

HAI '24: Proceedings of the 12th International Conference on Human-Agent Interaction
November 2024
502 pages
ISBN:9798400711787
DOI:10.1145/3687272
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Automated Forklift
  2. Gesture-based interaction
  3. Human-centered automation
  4. Material handling vehicle
  5. Multi-modal interface

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

HAI '24: International Conference on Human-Agent Interaction
November 24–27, 2024
Swansea, United Kingdom

Acceptance Rates

Overall Acceptance Rate 121 of 404 submissions, 30%
