Human–Robot Interaction in Industrial Settings: Perception of Multiple Participants at a Crossroad Intersection Scenario with Different Courtesy Cues
Figure captions (in order of appearance):
- Principal dimensions of: (a) the MiR 200 robot; (b) the MiR Charge 24 V.
- (a) Top view of the SICK laser scanners; (b) configuration of the 3D cameras and SICK laser scanners, side view; (c) FoV of 118°.
- MiR 200 protective field mechanism: (a) the robot drives when its path is clear; (b) the robot activates the proactive stop when an obstacle is detected within its protective field.
- Range of the robot's active protective field, which changes with the robot's speed (values in millimeters): (a) forward driving direction; (b) backward driving direction.
- Clean and edited map of the total industrial area used for the tests; the arrows indicate the area in which the robot was allowed to circulate.
- Industrial environment: a crossroad-like configuration with simultaneous forward and backward scenarios.
- Scheme of the four courtesy cues tested.
- Experimental apparatus of the HRC kinesic cue study conditions: Xd decelerate distance, Xr retreat distance, Xleft move-left distance (distances in meters).
- Profile plots for trust (a) and mistrust (b) scores by kinesic courtesy cue and by point of view.
- Legibility of the robot's kinesic courtesy cues: (a) forward view; (b) backward view.
- Observable signs of hesitation in the participants' behavior while encountering the AMR, for each kinesic courtesy cue condition: (a) forward view; (b) backward view.
Abstract
1. Introduction
Objectives
2. Materials and Methods
2.1. Participants
2.2. Material and Experimental Setup
2.2.1. MiR 200 Specifications, Navigation, Control, and Safety
2.2.2. Experimental Setup
The four kinesic courtesy cues tested were the following (a minimal motion sketch is given after this list):
- (i) "stop": the AMR stopped suddenly before the crossing area, held the stop for two seconds, and then resumed its trajectory to the final position.
- (ii) "decelerate": the AMR reduced its linear speed from v = 0.6 m/s to v = 0.2 m/s starting at a distance of 1.0 m (Xd in Figure 7) from the crossing area, stopped before the crossing area for two seconds, and then resumed its trajectory to the final position.
- (iii) "retreat": the robot stopped suddenly before the crossing area, retreated 1.0 m (Xr in Figure 7), stopped for two seconds, and then resumed its trajectory to the final position.
- (iv) "retreat and move to the left": the robot stopped suddenly before the crossing area, retreated 1.0 m, moved 0.2 m to the left (Xleft in Figure 7) relative to the central point of the crossing area, stopped for two seconds, and then resumed its trajectory to the final position.
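Each cue is fully specified by a few distances, speeds, and dwell times. The Python sketch below reconstructs the four cues as simple time-based motion segments; the helper functions `drive` and `pause`, the parameter names, and the simulation-by-sleeping approach are illustrative assumptions, not the controller actually used to command the MiR 200 (which is driven through its own mission interface). The lateral "move left" is likewise simplified to a straight sideways segment.

```python
import time

# Nominal parameters taken from the cue descriptions above; the helpers
# drive() and pause() are hypothetical stand-ins for the AMR's real
# velocity/mission interface.
CRUISE_SPEED = 0.6   # m/s, nominal linear speed of the AMR
SLOW_SPEED   = 0.2   # m/s, reduced speed used by the "decelerate" cue
X_D          = 1.0   # m, distance over which the robot decelerates (Xd)
X_R          = 1.0   # m, retreat distance (Xr)
X_LEFT       = 0.2   # m, lateral offset for "retreat and move left" (Xleft)
PAUSE_S      = 2.0   # s, stop duration before resuming the trajectory


def drive(distance_m: float, speed_mps: float, heading: str = "forward") -> None:
    """Simulate driving a straight segment by sleeping for its travel time."""
    travel_time = abs(distance_m) / speed_mps
    print(f"driving {abs(distance_m):.2f} m {heading} at {speed_mps:.1f} m/s "
          f"(~{travel_time:.1f} s)")
    time.sleep(travel_time)


def pause(seconds: float = PAUSE_S) -> None:
    print(f"stopped for {seconds:.0f} s")
    time.sleep(seconds)


def courtesy_cue(cue: str) -> None:
    """Execute one of the four kinesic courtesy cues at the crossing area."""
    if cue == "stop":
        pause()                                  # sudden stop before the crossing
    elif cue == "decelerate":
        drive(X_D, SLOW_SPEED)                   # approach the crossing at reduced speed over Xd
        pause()
    elif cue == "retreat":
        drive(X_R, CRUISE_SPEED, "backward")     # back away by Xr
        pause()
    elif cue == "retreat_and_move_left":
        drive(X_R, CRUISE_SPEED, "backward")     # back away by Xr
        drive(X_LEFT, SLOW_SPEED, "left")        # lateral offset Xleft
        pause()
    else:
        raise ValueError(f"unknown cue: {cue}")
    print("resuming trajectory to the final position")


if __name__ == "__main__":
    for cue in ("stop", "decelerate", "retreat", "retreat_and_move_left"):
        print(f"--- {cue} ---")
        courtesy_cue(cue)
```

In the experiment each cue ended with the same two-second dwell before the robot resumed its path, which the dispatcher above reflects by always calling `pause()` before printing the resume message.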
2.3. Procedure
2.4. Measures
2.4.1. Perceived Trust and Mistrust Assessment
2.4.2. Legibility Assessment
2.4.3. Behavioral Analysis
3. Results
3.1. Perceived Trust and Mistrust
Perceived trust was assessed with the HTA scale. A one-way ANOVA was applied after verifying the assumption of homogeneity of variance (p > 0.05); in addition, the Kolmogorov–Smirnov test confirmed the normality of the residuals (p > 0.05). The participants' trust and mistrust results are shown in Table 1.
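For illustration, the sequence of checks reported above (homogeneity of variance, one-way ANOVA, normality of residuals) can be reproduced with SciPy. The data below are synthetic placeholders sized only to match the F(4,80) degrees of freedom in Table 1 (five conditions, 17 scores each); the real HTA scores are not reproduced here.

```python
import numpy as np
from scipy import stats

# Illustrative per-condition trust scores (one array per condition);
# in the study these would be the participants' HTA scores.
rng = np.random.default_rng(0)
groups = [rng.normal(loc=5.0, scale=1.0, size=17) for _ in range(5)]  # 5 x 17 = 85 -> F(4,80)

# Homogeneity of variance across conditions (Levene's test).
lev_stat, lev_p = stats.levene(*groups)
print(f"Levene: W = {lev_stat:.3f}, p = {lev_p:.3f}")   # expect p > 0.05

# One-way ANOVA on trust scores by condition.
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.3f}, p = {p_value:.3f}")

# Normality of the residuals (scores centered on their group means),
# checked with a Kolmogorov-Smirnov test against a fitted normal.
residuals = np.concatenate([g - g.mean() for g in groups])
ks_stat, ks_p = stats.kstest(residuals, "norm",
                             args=(residuals.mean(), residuals.std(ddof=1)))
print(f"K-S: D = {ks_stat:.3f}, p = {ks_p:.3f}")        # expect p > 0.05
```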
3.2. Legibility Assessment
3.3. Behavioral Analysis
4. Discussion
Limitations and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Table 1. One-way ANOVA results for trust and mistrust scores, forward and backward points of view.

|          | Forward                    | Backward                   |
|----------|----------------------------|----------------------------|
| Trust    | F(4,80) = 1.082, p = 0.371 | F(4,80) = 0.486, p = 0.746 |
| Mistrust | F(4,80) = 0.564, p = 0.689 | F(4,80) = 0.966, p = 0.431 |
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).