EduARdo—Unity Components for Augmented Reality Environments
Figure 1. EduARdo Architecture.
Figure 2. Hand interaction toolset architecture.
Figure 3. Hand Interaction Toolset Components.
Figure 4. External hand-tracking module’s components.
Figure 5. Object movement behavior tool’s architecture.
Figure 6. Object movement behavior tool’s components.
Figure 7. Proposed system’s editors.
Figure 8. SUS answers per question.
Figure 9. Average scores on questions per Unity experience level.
Figure 10. Mind map of the thematic areas retrieved from interviews.
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
3.1. Hand Interaction Toolset
- Gesture is a component that describes a hand gesture and can check whether that gesture is currently being performed. Every gesture implementation must implement the IGesture interface so that all gestures expose the same behavior (see the sketch after this component list).
- Fingers is a component that maps the retrieved hand points to individual fingers. Through it, all the points of a specific finger can be retrieved as a list, where the first item is the finger’s base point and the last item is the fingertip point.
- GestureType is an enumerated value that defines the type of a gesture; the types it defines are the ones assigned to each gesture component.
- GestureManager is a component that manages all the gestures and provides convenient access to the gesture classes. It can also check when a specific gesture is completed and return the object currently being interacted with.
- HandAction is an abstract class that acts as a guideline for future implementations of action classes, i.e., the results of a successful interaction with an object. Sample actions include movement and rotation; developers can implement further action classes as needed.
- SocketClient is a simple client responsible for sending images from the device camera to the hand-tracking module and receiving the recognized hand points (a client sketch follows the hand-tracking module components below).
- ARFrameCapture is responsible for capturing the image from the camera of the employed device and then converting it into a form that can be sent to and processed by the hand-tracking module.
- ActionHand implements the functionality of the virtual hand. It defines the logic that determines how and when the hand can interact with an object.
- VisualHand visualizes the hand points retrieved from the hand-tracking module. This component is responsible for placing the virtual hand on top of the user’s hand in the device viewport.
- Selector is a component that contains the mechanism that checks whether an interaction between the hand and an object has been triggered.
- SelectionAction is a component that visually notifies the user about interactions with objects.
- RayCastProvider is a component that supplies the raycaster required for the corresponding interaction task.
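To make the gesture pipeline concrete, below is a minimal C# sketch of how an IGesture implementation, the Fingers component, and the GestureManager could fit together. All member names and thresholds (PinchGesture, IsPerformed, pinchThreshold) are illustrative assumptions rather than the toolset’s published API.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Assumed contract: every gesture exposes its type and reports
// whether the user is currently performing it.
public interface IGesture
{
    GestureType Type { get; }
    bool IsPerformed(Fingers fingers);
}

public enum GestureType { Pinch, Grab, Point }

// Simplified Fingers component: each list runs from the finger's
// base point (first item) to the fingertip (last item).
public class Fingers : MonoBehaviour
{
    public List<Vector3> Thumb = new List<Vector3>();
    public List<Vector3> Index = new List<Vector3>();
}

// Hypothetical pinch gesture: performed when the thumb and index
// fingertips come closer than a distance threshold.
public class PinchGesture : MonoBehaviour, IGesture
{
    [SerializeField] private float pinchThreshold = 0.03f; // meters

    public GestureType Type => GestureType.Pinch;

    public bool IsPerformed(Fingers fingers)
    {
        Vector3 thumbTip = fingers.Thumb[fingers.Thumb.Count - 1];
        Vector3 indexTip = fingers.Index[fingers.Index.Count - 1];
        return Vector3.Distance(thumbTip, indexTip) < pinchThreshold;
    }
}

// Minimal manager: registers gestures and polls them on demand.
public class GestureManager : MonoBehaviour
{
    private readonly List<IGesture> gestures = new List<IGesture>();
    public Fingers Fingers { get; set; }

    public void Register(IGesture gesture) => gestures.Add(gesture);

    public bool IsGestureComplete(GestureType type)
    {
        foreach (IGesture g in gestures)
            if (g.Type == type && g.IsPerformed(Fingers))
                return true;
        return false;
    }
}
```

In this arrangement, an ActionHand could poll GestureManager.IsGestureComplete(GestureType.Pinch) each frame and, on success, fire the appropriate HandAction (e.g., movement or rotation).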
The external hand-tracking module, which runs outside Unity, consists of the following components:
- Socket Server is the module’s server. It waits for devices running the Unity toolset to connect and receives their messages.
- Image Process handles the processing of the images received from the devices, reshaping each decoded image into the form expected by the hand-tracking algorithm.
- Message Receiver is a function responsible for receiving and decoding the messages sent from the devices.
- Hand-Tracking Algorithm is the algorithm used to recognize and track the hand in images. It is isolated in its own component so that it can easily be replaced.
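Since the wire protocol between SocketClient and the Socket Server is not spelled out here, the following is a minimal Unity-side sketch under an assumed format: each camera frame is sent as a 4-byte big-endian length header followed by JPEG bytes, and the server replies with a UTF-8 string of recognized hand points. The actual EduARdo protocol may differ.

```csharp
using System;
using System.Net.Sockets;
using UnityEngine;

// Hypothetical client: streams camera frames to the hand-tracking
// module and returns the server's reply as a string of hand points.
public class SocketClient
{
    private readonly TcpClient client = new TcpClient();
    private NetworkStream stream;

    public void Connect(string host, int port)
    {
        client.Connect(host, port);
        stream = client.GetStream();
    }

    public string SendFrame(Texture2D frame)
    {
        // Compress the captured frame (e.g., from ARFrameCapture).
        byte[] jpeg = frame.EncodeToJPG(75);

        // Assumed framing: 4-byte big-endian length prefix.
        byte[] header = BitConverter.GetBytes(jpeg.Length);
        if (BitConverter.IsLittleEndian) Array.Reverse(header);

        stream.Write(header, 0, header.Length);
        stream.Write(jpeg, 0, jpeg.Length);

        // Read the reply (assumed to fit in a single read).
        byte[] buffer = new byte[8192];
        int read = stream.Read(buffer, 0, buffer.Length);
        return System.Text.Encoding.UTF8.GetString(buffer, 0, read);
    }
}
```

On the module side, the Message Receiver would mirror this framing: read the length header, decode the JPEG, and pass the image to the hand-tracking algorithm.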
3.2. Object Movement Behavior Configuration
- ObjectMovementBehavior is the component that holds the logic of each behavior. It is attached to the BehaviorObject in the creation phase, and its methods are invoked during run time so that the object performs the requested behavior (see the sketch after this list).
- BehaviorManager is a component that assists in selecting and instantiating the correct behavior, both in the application creation phase and at run time when the objects are instantiated in the scene.
- BehaviorObject is a prefab imported into the Unity 3D game engine by the user; it is the object to which a behavior is attached.
- BehaviorSelectionEditor is a basic selector of the predefined behaviors that the user can choose from and configure for the desired object.
- OrbitBehaviorEditor is a component that holds the configuration options concerning the orbit behavior that can be added to an object.
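As an illustration of how a configured behavior could execute at run time, here is a minimal C# sketch of an ObjectMovementBehavior base class together with an orbit behavior. The field names (center, degreesPerSecond) stand in for whatever options the OrbitBehaviorEditor actually exposes and are assumptions, not the tool’s real configuration schema.

```csharp
using UnityEngine;

// Base class for run-time behaviors attached to a BehaviorObject.
public abstract class ObjectMovementBehavior : MonoBehaviour
{
    // Called every frame while the behavior is active.
    public abstract void Perform();

    private void Update() => Perform();
}

// Hypothetical orbit behavior: the object circles a center point
// at a configurable angular speed.
public class OrbitBehavior : ObjectMovementBehavior
{
    [SerializeField] private Transform center;
    [SerializeField] private float degreesPerSecond = 45f;

    public override void Perform()
    {
        if (center == null) return;
        transform.RotateAround(center.position, Vector3.up,
                               degreesPerSecond * Time.deltaTime);
    }
}
```

At creation time, a BehaviorManager would attach such a component to the BehaviorObject prefab and copy the editor’s configuration into its serialized fields.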
4. Experiment
4.1. System Usability Evaluation
4.1.1. Participants
4.1.2. Experiment Methodology
4.1.3. Tasks
4.1.4. Evaluation
4.2. System Acceptance by Educators
4.2.1. Participants
4.2.2. Interview Methodology
5. Results
5.1. System Evaluation (SUS)
5.2. Insights from Educators
5.2.1. Unity Development Environment
5.2.2. EduARdo
5.2.3. Requested Features
5.2.4. Scenarios of Usage
6. Discussion
6.1. SUS Results
6.2. Qualitative Results
6.3. Overview
7. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Question | Average Score | Standard Deviation | Coefficient of Variation |
|---|---|---|---|
| 1 | 3.94 | 1.020 | 0.111 |
| 2 | 3.72 | 0.447 | 0.262 |
| 3 | 4.44 | 0.761 | 0.350 |
| 4 | 3.22 | 1.030 | 0.171 |
| 5 | 4.50 | 0.600 | 0.579 |
| 6 | 3.61 | 0.590 | 0.133 |
| 7 | 4.50 | 0.500 | 0.425 |
| 8 | 3.72 | 0.558 | 0.111 |
| 9 | 4.44 | 0.684 | 0.436 |
| 10 | 3.61 | 0.487 | 0.154 |
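For reference, per-question scores such as those above feed into the standard SUS calculation: odd-numbered (positive) items contribute their score minus one, even-numbered (negative) items contribute five minus their score, and the sum is scaled by 2.5 onto a 0–100 range:

$$
\mathrm{SUS} = 2.5 \left[ \sum_{i \in \{1,3,5,7,9\}} (s_i - 1) + \sum_{i \in \{2,4,6,8,10\}} (5 - s_i) \right]
$$

where $s_i$ is a participant’s 1–5 response to item $i$.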