Abstract
Conducting collaborative tasks, e.g., multi-user games, in virtual reality (VR) can enable more immersive and effective experiences. However, in current VR systems users cannot properly communicate with each other via their gaze points, which interferes with their mutual understanding of intention. In this study, we aimed to find the optimal eye tracking data visualization, one that minimized cognitive interference and improved the understanding of visual attention and intention between users. We designed three eye tracking data visualizations (gaze cursor, gaze spotlight, and gaze trajectory) in a VR scene for a course on the human heart, and found that the doctors' gaze cursor helped students learn complex 3D heart models more effectively. To explore further, pairs of students were asked to complete a quiz in the VR environment while sharing gaze cursors with each other, and they achieved higher efficiency and scores. This indicates that sharing eye tracking data visualization can improve the quality and efficiency of collaborative work in VR environments.
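All three visualizations rest on the same primitive: intersecting a user's gaze ray with scene geometry to obtain a 3D point that can be rendered for the other user. The sketch below is not the authors' implementation; it is a minimal illustration that approximates the target model (e.g., the 3D heart) with a hypothetical bounding sphere and places a gaze cursor at the first ray-sphere intersection.

```python
import math
from dataclasses import dataclass

@dataclass
class GazeSample:
    origin: tuple      # eye position in world coordinates
    direction: tuple   # gaze direction (need not be pre-normalized)

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def gaze_cursor_on_sphere(sample, center, radius):
    """Return the world-space point where the gaze ray first hits a
    sphere (a stand-in for the shared 3D model), or None on a miss."""
    o = sample.origin
    d = _normalize(sample.direction)
    oc = tuple(o[i] - center[i] for i in range(3))
    # Solve |o + t*d - center|^2 = r^2 for the smallest positive t.
    b = 2.0 * sum(oc[i] * d[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None            # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    if t < 0:
        return None            # sphere is behind the viewer
    return tuple(o[i] + t * d[i] for i in range(3))
```

A gaze trajectory can then be drawn by buffering successive cursor points per user, and a gaze spotlight by rendering a soft-edged disc around the most recent point; in a real system the ray would come from an eye tracker SDK and the intersection from the engine's ray cast.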
Acknowledgement
The authors would like to thank all the volunteers who participated in the experiments. This work was supported in part by the National Natural Science Foundation of China under Grants 62172368 and 61772468, and by the Natural Science Foundation of Zhejiang Province under Grant LR22F020003.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Zhao, S., Cheng, S., Zhu, C. (2023). 3D Gaze Vis: Sharing Eye Tracking Data Visualization for Collaborative Work in VR Environment. In: Sun, Y., et al. Computer Supported Cooperative Work and Social Computing. ChineseCSCW 2022. Communications in Computer and Information Science, vol 1682. Springer, Singapore. https://doi.org/10.1007/978-981-99-2385-4_46
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-2384-7
Online ISBN: 978-981-99-2385-4