Abstract
Although virtual reality (VR) training provides a realistic, immersive environment for learners, it can also hinder training progress by imposing cognitive load or distracting learners from the main content. In this study, a gaze visualization interface was applied to a medical safety training system to prevent learner distraction, a common issue in conventional VR training systems. An experiment was conducted to analyze how applying the gaze visualization interface to a medical training system affects learning performance. Based on related studies, three metrics were selected to evaluate learning performance: learning flow, achievement, and concentration. Flow and achievement were evaluated after the experiment through surveys and quizzes; however, such retrospective methods depend on learners' memories, which may be distorted, making it difficult to analyze learning outcomes objectively. Eye-tracking technology is useful for measuring learning performance in multimedia learning environments and can express learners' eye movements as objective metrics; it was therefore used in this study to evaluate concentration. The experimental results suggest that the gaze visualization interface had positive effects on learners' concentration, learning flow, and achievement.
Acknowledgements
This research was supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korean government (MSIT) (No.2021-0-00986, Development of Interaction Technology to Maximize Realization of Virtual Reality Contents using Multimodal Sensory Interface) and funded by the rising professor Financial Program at Changwon National University in 2023.
Ethics declarations
Conflict of interest
The authors declare that they have no conflicts of interest.
Cite this article
Choi, H., Kwon, J. & Nam, S. Research on the application of gaze visualization interface on virtual reality training systems. J Multimodal User Interfaces 17, 203–211 (2023). https://doi.org/10.1007/s12193-023-00409-6