Abstract
Augmented reality (AR) technologies provide a shared platform for users to collaborate in a physical context involving both real and virtual content. To enhance the quality of interaction between AR users, researchers have proposed augmenting users’ interpersonal space with embodied cues such as their gaze direction. While beneficial for interpersonal spatial communication, such shared gaze environments suffer from multiple types of eye tracking and networking errors that can reduce objective performance and subjective experience. In this paper, we present a human-subjects study investigating the impact of accuracy, precision, latency, and dropout errors on users’ performance when shared gaze cues are used to identify a target among a crowd of people. We simulated varying error levels and target distances, measured participants’ objective performance through their response time and error rate, and assessed their subjective experience and cognitive load through questionnaires. We found significant differences suggesting that the simulated error levels had stronger effects on participants’ performance than target distance, with accuracy and latency having the greatest impact on participants’ error rate. We also observed that participants assessed their own performance as lower than it objectively was. We discuss implications for practical shared gaze applications and present a multi-user prototype system.
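As a rough illustration of how the four simulated error types can be applied to a gaze stream, the following Python sketch perturbs gaze angles with a constant accuracy offset and zero-mean Gaussian precision jitter, buffers samples to model latency, and randomly discards samples to model dropout. The class, parameter names, and default values are hypothetical and do not reproduce the study's implementation.

```python
import random
from collections import deque

class SimulatedGazeErrors:
    """Illustrative sketch (hypothetical, not the study's code): applies four
    error types to a stream of gaze angles (azimuth, elevation in degrees).
    - accuracy: constant angular offset
    - precision: zero-mean Gaussian jitter per sample
    - latency: samples become visible only after a fixed delay
    - dropout: each sample is lost with some probability (last value repeated)
    """

    def __init__(self, offset_deg=1.0, jitter_sd_deg=0.5,
                 latency_s=0.1, dropout_p=0.05, seed=None):
        self.offset_deg = offset_deg
        self.jitter_sd_deg = jitter_sd_deg
        self.latency_s = latency_s
        self.dropout_p = dropout_p
        self.rng = random.Random(seed)
        self.buffer = deque()          # (release_time, azimuth, elevation)
        self.last_output = (0.0, 0.0)  # repeated when a sample drops out

    def push(self, t, azimuth_deg, elevation_deg):
        """Queue a raw gaze sample taken at time t (seconds)."""
        if self.rng.random() < self.dropout_p:
            return  # simulated dropout: the sample never arrives
        az = azimuth_deg + self.offset_deg + self.rng.gauss(0.0, self.jitter_sd_deg)
        el = elevation_deg + self.rng.gauss(0.0, self.jitter_sd_deg)
        self.buffer.append((t + self.latency_s, az, el))

    def pull(self, t):
        """Return the gaze angles shown to the remote user at time t."""
        while self.buffer and self.buffer[0][0] <= t:
            _, az, el = self.buffer.popleft()
            self.last_output = (az, el)
        return self.last_output
```

In a setup like the one described, such a wrapper would sit between the local eye tracker output and the renderer that draws the shared gaze ray for the collaborating user, so that each error type can be varied independently per trial.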
Funding
This material includes work supported in part by the Office of Naval Research under Award Number N00014-17-1-2927 (Dr. Peter Squire, Code 34); the National Science Foundation under Collaborative Award Number 1800961 (Dr. Ephraim P. Glinert, IIS); and the AdventHealth Endowed Chair in Healthcare Simulation (Prof. Welch). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the supporting institutions.
Cite this article
Erickson, A., Norouzi, N., Kim, K. et al. Sharing gaze rays for visual target identification tasks in collaborative augmented reality. J Multimodal User Interfaces 14, 353–371 (2020). https://doi.org/10.1007/s12193-020-00330-2