A Comparison of Two Interaction Paradigms for Training Low Cost Automation Assembly in Virtual Environments †
Figure. The architecture of the system. The devices circled in orange exchange data with SteamVR, whereas the Manus gloves exchange data with the Manus Core application.

Figure. The three different hand poses tracked by the system: open hand (left), grab pose (center), and pointing pose (right).
Abstract
1. Introduction
- Zero or very little power consumption;
- A few actuators;
- High flexibility;
- High reliability;
- Small dimensions;
- Minimum maintenance;
- Minimum investment/running costs.
- R1: Is the hand-tracking interaction paradigm more usable than the controller-based paradigm?
- R2: Is the level of workload involved in the hand-tracking interaction paradigm lower than that of the controller-based paradigm?
2. State of the Art
2.1. Virtual Environments
2.2. Hand-Tracking Interfaces
2.3. Contribution
3. The Immersive Virtual Reality Prototyping Environment
- Virtual menus: these menus provide both information and buttons, which can be virtually pressed to activate different options, e.g., adding 3D objects to the scene;
- Grab & release: used to select and move objects in the 3D scene;
- Connecting objects: used to create complex LCAs by connecting simple objects through collision; connections depend on the concepts of compatibility and connection points, which have been explored in [8];
- Separating objects: used to remove an object or objects group from a complex object;
- Scaling objects: used to change one or more dimensions of a simple object, e.g., the length of a pipe.
- Adjusting objects: used to change an object or an object group position without detaching it from a complex object, e.g., to adjust the vertical position of a pipe;
- Copy position: used to copy one of the object’s three coordinates to or from another object;
- Copy size: used to copy the object’s size from another object of the same type;
- Split: used to split a complex object into its parts;
- Distance: used to compute the gap between the current object and another.
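The connect-on-collision mechanic described above can be sketched in code. This is a minimal illustrative sketch, not the paper's actual implementation: the class names, the `accepts` sets, and the `try_connect` helper are all our assumptions about how compatibility and connection points might be modeled.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each simple object exposes typed connection points;
# two colliding objects snap together only if a free point on one is
# compatible with a free point on the other.

@dataclass
class ConnectionPoint:
    kind: str           # e.g. "pipe_end", "joint_socket" (invented labels)
    accepts: set        # kinds this point may connect to
    occupied: bool = False

@dataclass
class SimpleObject:
    name: str
    points: list = field(default_factory=list)

def try_connect(a: SimpleObject, b: SimpleObject) -> bool:
    """On collision, link the first pair of mutually compatible free points."""
    for pa in a.points:
        for pb in b.points:
            if pa.occupied or pb.occupied:
                continue
            if pb.kind in pa.accepts and pa.kind in pb.accepts:
                pa.occupied = pb.occupied = True
                return True
    return False

pipe = SimpleObject("pipe", [ConnectionPoint("pipe_end", {"joint_socket"})])
joint = SimpleObject("joint", [ConnectionPoint("joint_socket", {"pipe_end"})])
print(try_connect(pipe, joint))  # True: compatible free points found
print(try_connect(pipe, joint))  # False: the points are now occupied
```

The split operation would be the inverse: clearing the `occupied` flags of the points that link a part to the rest of the complex object.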
- Menu button: used for menu management. When pressed, the menus available in the application are opened or closed according to the system’s status;
- Trigger button: the button on the back of the controller; when pressed, it starts the action of grabbing an object; when released, the grabbed object is detached from the hand;
- Grip button: the button on the sides of the controller; when pressed while pointing at an object that is part of a complex structure, it detaches that object from the structure;
- Touchpad: the central button on the front of the controller; when pressed together with the trigger while pointing at a component of a complex structure, it changes the component’s position without detaching it, thus facilitating fine pose correction.
- Open hand: detected when the user completely opens one hand. Used to release an object after a grab action, or to show the main menu panel, if the hand is facing the user.
- Grab pose: detected when the user completely closes one hand. If the hand collides with an object, the grab action starts.
- Pointing pose: used to select different options in the virtual menus, or to open contextual menus by touching objects in the 3D scene.
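The three poses above could be classified from per-finger flexion values. The following is only an illustrative sketch: the input format, the threshold values, and the function name are our assumptions, not the Manus SDK's actual API.

```python
# Illustrative sketch: classify the three tracked poses from per-finger
# flexion values in [0, 1] (0 = fully extended, 1 = fully curled).
# Thresholds are assumptions chosen for readability.

OPEN_MAX = 0.2    # a finger counts as extended below this flexion
CLOSED_MIN = 0.8  # a finger counts as curled above this flexion

def classify_pose(flexion):
    """flexion: [thumb, index, middle, ring, pinky] flexion values."""
    thumb, index, *rest = flexion
    if all(f < OPEN_MAX for f in flexion):
        return "open_hand"   # release, or show the main menu if palm faces user
    if all(f > CLOSED_MIN for f in flexion):
        return "grab"        # start a grab if the hand collides with an object
    if index < OPEN_MAX and all(f > CLOSED_MIN for f in rest):
        return "pointing"    # menu selection / contextual menus
    return "none"

print(classify_pose([0.1, 0.0, 0.1, 0.05, 0.1]))   # open_hand
print(classify_pose([0.9, 0.95, 0.9, 0.85, 0.9]))  # grab
print(classify_pose([0.5, 0.1, 0.9, 0.9, 0.85]))   # pointing
```

In practice a real classifier would also debounce transitions over several frames to avoid flicker between poses.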
4. Test and Results
4.1. Test Procedures
- Introduction: the user is introduced to the proposed research, and the test procedure is explained in detail;
- User data: the user fills out a preliminary form to collect general user information and to ask some questions pertaining to their previous experiences with VR;
- Hardware setup: the user is equipped with the HTC VIVE Pro headset; if the Manus gloves are used, the calibration procedure is carried out;
- Virtual environment tutorial: when the application starts, after selecting the desired interaction interface, a set of instruction panels guides the user, showing the application functionalities; each panel is composed of a textual description, images, and/or videos displaying the application usage; if the controllers are displayed in the panel, interacting with the different buttons of the physical controller highlights its graphical counterpart on the virtual panel. The panels guide the user into practicing with the available commands, e.g., adding objects to the scene, grabbing, connecting, or scaling them. When the tutorial ends, testers are given 5 more minutes to explore the virtual environment further and to try out the available commands and menus;
- LCA assembly task: after the tutorial step, the users are asked to assemble an LCA starting from an existing reference displayed in the virtual environment. The 3D reference is not merely a graphical representation of the target LCA; it was itself assembled through the virtual environment’s mechanics, so it is possible to measure the distances between different parts of the reference LCA or to copy the sizes of its parts;
- Repetition: the user repeats steps 5–6 using the other interface;
- Interview: the user is then interviewed regarding the overall experience.
4.2. Measurements
4.3. Results
4.4. Discussion
- Normality assessment of the data via the Shapiro–Wilk test;
- If the data are normally distributed: dependent (paired) t-test;
- If the data are not normally distributed: the Wilcoxon signed-rank test when the distribution of differences (DD) is symmetrical; otherwise, the sign test.
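The decision procedure above can be sketched with SciPy. One caveat: the paper does not say how symmetry of the DD was judged, so the skewness threshold below is our assumption, as is the helper's name and signature.

```python
# Sketch of the paired-comparison pipeline: Shapiro-Wilk for normality,
# then paired t-test, Wilcoxon signed-rank, or sign test as appropriate.
import numpy as np
from scipy import stats

def paired_comparison(a, b, alpha=0.05, skew_tol=0.5):
    a, b = np.asarray(a, float), np.asarray(b, float)
    d = a - b
    # 1. Normality of both samples (Shapiro-Wilk).
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    if normal:
        return "dependent t-test", stats.ttest_rel(a, b).pvalue
    # 2. Non-normal: Wilcoxon signed-rank if the DD is roughly symmetrical
    #    (here judged by a skewness threshold -- an assumption on our part).
    if abs(stats.skew(d)) < skew_tol:
        return "Wilcoxon signed-rank", stats.wilcoxon(a, b).pvalue
    # 3. Otherwise: sign test, i.e. a binomial test on the sign of d,
    #    discarding zero differences.
    nz = d[d != 0]
    return "sign test", stats.binomtest(int((nz > 0).sum()), nz.size).pvalue

# SUS scores per user from the results table.
sus_vive  = [85, 35, 75, 57.5, 67.5, 65, 70, 60, 87.5, 80, 90, 57.5]
sus_manus = [77.5, 75, 80, 82.5, 60, 80, 90, 67.5, 87.5, 80, 87.5, 75]
print(paired_comparison(sus_vive, sus_manus))
```

The same helper applies unchanged to the completion times and NASA-TLX totals, since all measurements are paired per user.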
5. Conclusions and Future Works
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| CAVE | Cave Automatic Virtual Environment |
| DD | Distribution of Differences |
| IVE | Immersive Virtual Environment |
| IVR | Immersive Virtual Reality |
| LCA | Low-Cost Automation |
| SDK | Software Development Kit |
| SUS | System Usability Scale |
| TLX | Task Load Index |
| VADE | Virtual Assembly Design Environment |
| VR | Virtual Reality |
| VSE | Virtual Simulation Environment |
References
- Brettel, M.; Friederichsen, N.; Keller, M.; Rosenberg, M. How virtualization, decentralization and network building change the manufacturing landscape: An Industry 4.0 Perspective. Int. J. Inf. Commun. Eng. 2014, 8, 37–44.
- Erboz, G. How to define industry 4.0: Main pillars of industry 4.0. Manag. Trends Dev. Enterp. Glob. Era 2017, 761, 767.
- Yin, S.; Kaynak, O. Big data for modern industry: Challenges and trends [point of view]. Proc. IEEE 2015, 103, 143–146.
- Berni, A.; Borgianni, Y. Applications of virtual reality in engineering and product design: Why, what, how, when and where. Electronics 2020, 9, 1064.
- Gaudette, D.A. Low Cost Automation. SAE Trans. 1998, 107, 1108–1113.
- Katayama, H.; Sawa, K.; Hwang, R.; Ishiwatari, N.; Hayashi, N. Analysis and classification of Karakuri technologies for reinforcement of their visibility, improvement and transferability: An attempt for enhancing lean management. In Proceedings of the PICMET’14 Conference: Portland International Center for Management of Engineering and Technology, Infrastructure and Service Integration, Kanazawa, Japan, 27–31 July 2014; pp. 1895–1906.
- Anggrahini, D.; Prasetyawan, Y.; Diartiwi, S.I. Increasing production efficiency using Karakuri principle (a case study in small and medium enterprise). In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2020; Volume 852, p. 012117.
- Manuri, F.; Gravina, N.; Sanna, A.; Brizzi, P. Prototyping industrial workstation in the Metaverse: A Low Cost Automation assembly use case. In Proceedings of the 2022 IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE), Rome, Italy, 26–28 October 2022; pp. 133–138.
- Jayaram, S.; Connacher, H.I.; Lyons, K.W. Virtual assembly using virtual reality techniques. Comput.-Aided Des. 1997, 29, 575–584.
- Jayaram, S.; Jayaram, U.; Wang, Y.; Tirumali, H.; Lyons, K.; Hart, P. VADE: A virtual assembly design environment. IEEE Comput. Graph. Appl. 1999, 19, 44–50.
- Dewar, R.G.; Carpenter, I.D.; Ritchie, J.M.; Simmons, J.E. Assembly planning in a virtual environment. In Proceedings of the Innovation in Technology Management, The Key to Global Leadership, PICMET’97, Portland, OR, USA, 27–31 July 1997; pp. 664–667.
- Seth, A.; Vance, J.M.; Oliver, J.H. Virtual reality for assembly methods prototyping: A review. Virtual Real. 2011, 15, 5–20.
- Liu, K.; Yin, X.; Fan, X.; He, Q. Virtual Assembly with Physical Information: A Review. Assem. Autom. 2015, 35, 206–220.
- Francillette, Y.; Boucher, E.; Bouzouane, A.; Gaboury, S. The virtual environment for rapid prototyping of the intelligent environment. Sensors 2017, 17, 2562.
- Noghabaei, M.; Asadi, K.; Han, K. Virtual manipulation in an immersive virtual environment: Simulation of virtual assembly. In Computing in Civil Engineering 2019: Visualization, Information Modeling, and Simulation; American Society of Civil Engineers: Reston, VA, USA, 2019; pp. 95–102.
- Sharma, S.; Bodempudi, S.T.; Arrolla, M.; Upadhyay, A. Collaborative Virtual Assembly Environment for Product Design. In Proceedings of the 2019 International Conference on Computational Science and Computational Intelligence (CSCI), Las Vegas, NV, USA, 5–7 December 2019; pp. 606–611.
- Pan, X.; Cui, X.; Huo, H.; Jiang, Y.; Zhao, H.; Li, D. Virtual Assembly of Educational Robot Parts Based on VR Technology. In Proceedings of the 2019 IEEE 11th International Conference on Engineering Education (ICEED), Kanazawa, Japan, 6–7 November 2019; pp. 1–5.
- Halabi, O. Immersive virtual reality to enforce teaching in engineering education. Multimed. Tools Appl. 2020, 79, 2987–3004.
- Shamsuzzoha, A.; Toshev, R.; Vu Tuan, V.; Kankaanpaa, T.; Helo, P. Digital factory–virtual reality environments for industrial training and maintenance. Interact. Learn. Environ. 2021, 29, 1339–1362.
- Almeida, L.; Lopes, E.; Yalçinkaya, B.; Martins, R.; Lopes, A.; Menezes, P.; Pires, G. Towards natural interaction in immersive reality with a cyber-glove. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 2653–2658.
- Khundam, C.; Vorachart, V.; Preeyawongsakul, P.; Hosap, W.; Noël, F. A comparative study of interaction time and usability of using controllers and hand tracking in virtual reality training. Informatics 2021, 8, 60.
- Rantamaa, H.R.; Kangas, J.; Kumar, S.K.; Mehtonen, H.; Järnstedt, J.; Raisamo, R. Comparison of a VR Stylus with a Controller, Hand Tracking, and a Mouse for Object Manipulation and Medical Marking Tasks in Virtual Reality. Appl. Sci. 2023, 13, 2251.
- Brooke, J. System Usability Scale (SUS): A “Quick and Dirty” Usability Scale. In Usability Evaluation in Industry; Taylor and Francis: Abingdon, UK, 1986.
- Hart, S.G. NASA Task Load Index (TLX). 1986. Available online: https://ntrs.nasa.gov/api/citations/20000021488/downloads/20000021488.pdf (accessed on 19 April 2023).
- Bangor, A.; Kortum, P.; Miller, J. Determining what individual SUS scores mean: Adding an adjective rating scale. J. Usability Stud. 2009, 4, 114–123.
- Zhou, W.; Xu, J.; Jiang, Q.; Chen, Z. No-reference quality assessment for 360-degree images by analysis of multifrequency information and local-global naturalness. IEEE Trans. Circuits Syst. Video Technol. 2021, 32, 1778–1791.
- Kim, H.G.; Lim, H.T.; Ro, Y.M. Deep virtual reality image quality assessment with human perception guider for omnidirectional image. IEEE Trans. Circuits Syst. Video Technol. 2019, 30, 917–928.
SUS scores, completion times, errors, and questions asked, per user and interface.

| User | SUS (VIVE) | SUS (Manus) | Completion Time (VIVE) | Completion Time (Manus) | Errors (VIVE) | Errors (Manus) | Questions (VIVE) | Questions (Manus) |
|---|---|---|---|---|---|---|---|---|
| 1 | 85 | 77.5 | 1378 | 772 | 2 | 2 | 3 | 1 |
| 2 | 35 | 75 | 828 | 754 | 1 | 0 | 1 | 2 |
| 3 | 75 | 80 | 522 | 400 | 0 | 1 | 0 | 1 |
| 4 | 57.5 | 82.5 | 530 | 360 | 0 | 1 | 0 | 1 |
| 5 | 67.5 | 60 | 866 | 736 | 3 | 3 | 2 | 1 |
| 6 | 65 | 80 | 400 | 702 | 0 | 0 | 1 | 3 |
| 7 | 70 | 90 | 1674 | 883 | 1 | 1 | 4 | 1 |
| 8 | 60 | 67.5 | 707 | 743 | 1 | 2 | 2 | 2 |
| 9 | 87.5 | 87.5 | 486 | 545 | 1 | 0 | 1 | 0 |
| 10 | 80 | 80 | 831 | 601 | 0 | 1 | 2 | 0 |
| 11 | 90 | 87.5 | 366 | 307 | 0 | 1 | 0 | 0 |
| 12 | 57.5 | 75 | 609 | 571 | 1 | 1 | 2 | 0 |
| Average | 69.17 | 78.54 | 766.42 | 614.5 | 0.83 | 1.08 | 1.5 | 1 |
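SUS scoring follows Brooke's standard formula: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum of the ten contributions is multiplied by 2.5 to yield a 0–100 score. The snippet below shows the formula and reproduces the per-condition SUS means above; the function name is ours.

```python
def sus_score(responses):
    """Standard SUS scoring. responses: ten 1-5 Likert answers, item 1 first."""
    total = sum(r - 1 if i % 2 == 0 else 5 - r   # odd items: r-1; even: 5-r
                for i, r in enumerate(responses))
    return total * 2.5                            # scale 0-40 sum to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0 (best possible)

# Per-user SUS scores from the table, averaged per condition.
vive  = [85, 35, 75, 57.5, 67.5, 65, 70, 60, 87.5, 80, 90, 57.5]
manus = [77.5, 75, 80, 82.5, 60, 80, 90, 67.5, 87.5, 80, 87.5, 75]
print(round(sum(vive) / len(vive), 2))   # 69.17
print(round(sum(manus) / len(manus), 2)) # 78.54
```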
NASA-TLX Weighted Total Workload

| User | VIVE | Manus |
|---|---|---|
| 1 | 49.33 | 42.00 |
| 2 | 89.33 | 45.00 |
| 3 | 40.67 | 51.33 |
| 4 | 43.33 | 24.67 |
| 5 | 58.00 | 46.67 |
| 6 | 36.67 | 19.33 |
| 7 | 44.00 | 18.00 |
| 8 | 47.33 | 38.00 |
| 9 | 24.67 | 37.33 |
| 10 | 48.00 | 28.67 |
| 11 | 52.67 | 55.33 |
| 12 | 48.67 | 42.67 |
| Mean | 48.56 | 37.42 |
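In the weighted NASA-TLX, each of the six subscale ratings (0–100) is multiplied by its pairwise-comparison weight (0–5); since the fifteen pairwise comparisons make the weights sum to 15, the overall workload is the sum of the weighted subscores divided by 15. The subscore tables that follow already store the weight × rating products, so a total can be recovered directly, as this small check on user 1's VIVE row shows (the dictionary layout is ours):

```python
# Weighted subscores (weight x rating) for user 1 with the VIVE controllers,
# taken from the subscore table. Total workload = sum / 15, because the six
# pairwise-comparison weights always sum to 15.
vive_user1 = {"mental": 120, "physical": 0, "temporal": 100,
              "performance": 300, "effort": 200, "frustration": 20}

total = sum(vive_user1.values()) / 15
print(round(total, 2))  # 49.33, matching the first row of the table above
```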
NASA-TLX Weighted Subscores—VIVE Controllers

| User | Mental | Physical | Temporal | Performance | Effort | Frustration |
|---|---|---|---|---|---|---|
| 1 | 120 | 0 | 100 | 300 | 200 | 20 |
| 2 | 360 | 0 | 400 | 180 | 100 | 300 |
| 3 | 300 | 50 | 90 | 80 | 80 | 10 |
| 4 | 160 | 0 | 200 | 100 | 40 | 150 |
| 5 | 210 | 20 | 200 | 80 | 300 | 60 |
| 6 | 160 | 20 | 10 | 300 | 60 | 0 |
| 7 | 240 | 0 | 100 | 150 | 50 | 120 |
| 8 | 150 | 0 | 80 | 250 | 80 | 150 |
| 9 | 160 | 0 | 60 | 100 | 40 | 10 |
| 10 | 120 | 0 | 10 | 240 | 200 | 150 |
| 11 | 210 | 40 | 150 | 150 | 240 | 0 |
| 12 | 100 | 0 | 350 | 120 | 40 | 120 |
NASA-TLX Weighted Subscores—Manus Gloves

| User | Mental | Physical | Temporal | Performance | Effort | Frustration |
|---|---|---|---|---|---|---|
| 1 | 240 | 0 | 120 | 150 | 60 | 60 |
| 2 | 200 | 20 | 200 | 200 | 100 | 0 |
| 3 | 350 | 20 | 240 | 90 | 30 | 40 |
| 4 | 60 | 0 | 80 | 120 | 20 | 90 |
| 5 | 160 | 120 | 20 | 280 | 120 | 0 |
| 6 | 80 | 20 | 20 | 150 | 20 | 0 |
| 7 | 90 | 10 | 50 | 80 | 0 | 40 |
| 8 | 100 | 0 | 150 | 200 | 80 | 40 |
| 9 | 200 | 0 | 80 | 100 | 80 | 100 |
| 10 | 150 | 40 | 10 | 150 | 80 | 0 |
| 11 | 240 | 140 | 210 | 100 | 140 | 0 |
| 12 | 150 | 20 | 240 | 150 | 80 | 0 |
NASA-TLX Raw Individual Mean Scores

| Subscale | VIVE | Manus |
|---|---|---|
| Mental demand | 55.83 | 47.50 |
| Physical demand | 29.17 | 24.17 |
| Temporal demand | 45.83 | 34.17 |
| Performance | 48.33 | 34.17 |
| Effort | 52.50 | 36.67 |
| Frustration | 35.83 | 26.67 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Manuri, F.; Decataldo, F.; Sanna, A.; Brizzi, P. A Comparison of Two Interaction Paradigms for Training Low Cost Automation Assembly in Virtual Environments. Information 2023, 14, 340. https://doi.org/10.3390/info14060340