
Research on the application of gaze visualization interface on virtual reality training systems

  • Original Paper
Journal on Multimodal User Interfaces

Abstract

Although virtual reality (VR) training provides a realistic, immersive environment for learners, it can also hinder training progress by increasing cognitive load or distracting learners from the main content of the training. In this study, a gaze visualization interface was applied to a medical safety training system to reduce learner distraction, a common issue in traditional VR training systems. An experiment was conducted to analyze the effects of applying the gaze visualization interface to a medical training system on learning performance. Based on related studies, three metrics were selected to evaluate learning performance: learning flow, achievement, and concentration. Learning flow and achievement were evaluated after the experiment through surveys and quizzes; however, such post hoc methods are subject to distortions in learners’ memories, making it difficult to analyze learning outcomes objectively. Eye-tracking technology, which captures learners’ eye movements as objective metrics, is useful for measuring learning performance in multimedia learning environments and was therefore used in this study to evaluate concentration. The experimental results suggest that the gaze visualization interface had positive effects on learners’ concentration, learning flow, and achievement.



Acknowledgements

This research was supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2021-0-00986, Development of Interaction Technology to Maximize Realization of Virtual Reality Contents using Multimodal Sensory Interface) and by the Rising Professor Financial Program at Changwon National University in 2023.

Author information

Corresponding author

Correspondence to Sanghun Nam.

Ethics declarations

Conflict of interest

The authors declare that they have no conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Choi, H., Kwon, J. & Nam, S. Research on the application of gaze visualization interface on virtual reality training systems. J Multimodal User Interfaces 17, 203–211 (2023). https://doi.org/10.1007/s12193-023-00409-6

