The eyes don't have it: an empirical comparison of head-based and eye-based selection in virtual reality

Published: 16 October 2017

Abstract

We present a study comparing selection performance between three eye/head interaction techniques using the recently released FOVE head-mounted display (HMD). The FOVE offers an integrated eye tracker, which we use as an alternative to the potentially fatiguing and uncomfortable head-based selection used with other commercial devices. Our experiment was modelled after the ISO 9241-9 reciprocal selection task, with targets presented at varying depths in a custom virtual environment. We compared eye-based selection and head-based selection (i.e., gaze direction) in isolation, as well as a third condition that used both eye-tracking and head-tracking at once. Results indicate that eye-only selection offered the worst performance in terms of error rate, selection time, and throughput, while head-only selection offered significantly better performance.
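Reading aid: the throughput measure referenced above follows the ISO 9241-9 / Fitts' law convention. Below is a minimal Python sketch of the standard effective-throughput calculation in the Soukoreff and MacKenzie formulation; it is not the authors' analysis code, and all sample values are illustrative. For head- and eye-based pointing, amplitude and width are typically measured as visual angles.

```python
import math
import statistics

def effective_width(deviations_deg):
    """Effective target width W_e = 4.133 * SD of selection-endpoint
    deviations projected onto the task axis (ISO 9241-9 convention)."""
    return 4.133 * statistics.stdev(deviations_deg)

def throughput(amplitude_deg, deviations_deg, movement_times_s):
    """Throughput (bits/s) = ID_e / MT, where
    ID_e = log2(A / W_e + 1) is the effective index of difficulty.
    Strictly, A should be the mean of the actually travelled
    amplitudes (A_e); a nominal amplitude is used here for brevity."""
    w_e = effective_width(deviations_deg)
    id_e = math.log2(amplitude_deg / w_e + 1)
    return id_e / statistics.mean(movement_times_s)

# Illustrative data: endpoint deviations (degrees of visual angle)
# and per-trial movement times (seconds) for one condition.
devs = [0.4, -0.6, 0.5, 0.1, -0.3, 0.7, -0.2, 0.3]
times = [0.90, 1.10, 1.00, 0.95, 1.05, 1.00, 0.98, 1.02]
print(f"TP = {throughput(20.0, devs, times):.2f} bits/s")
```

Using the effective (post-hoc) width rather than the nominal target width normalizes for speed-accuracy trade-offs, which is what makes throughput comparable across conditions such as eye-only, head-only, and combined selection.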

Supplementary Material

MP4 File (p91-qian.mp4)


Cited By

  • (2025) SF-SAM-Adapter: SAM-based segmentation model integrates prior knowledge for gaze image reflection noise removal. Alexandria Engineering Journal, 111, 521-529. DOI: 10.1016/j.aej.2024.10.092
  • (2024) PTVR – A software in Python to make virtual reality experiments easier to build and more reproducible. Journal of Vision, 24(4), 19. DOI: 10.1167/jov.24.4.19
  • (2024) Evaluation of Retrieval Techniques for Out-of-Range VR Objects, Contrasting Controller-Based and Free-Hand Interaction. Proceedings of the 2024 ACM Symposium on Spatial User Interaction, 1-11. DOI: 10.1145/3677386.3682090
  • (2024) Exploring the Impact of Display Field-of-view and Display Type on Text Readability and Visual Text Search on Large Displays in Virtual Reality. Proceedings of the 50th Graphics Interface Conference, 1-11. DOI: 10.1145/3670947.3670984
  • (2024) Virtual Task Environments Factors Explored in 3D Selection Studies. Proceedings of the 50th Graphics Interface Conference, 1-16. DOI: 10.1145/3670947.3670983
  • (2024) The Effect of Degraded Eye Tracking Accuracy on Interactions in VR. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3656369
  • (2024) Hand Me This: Exploring the Effects of Gaze-driven Animations and Hand Representations in Users' Sense of Presence and Embodiment. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3656362
  • (2024) Body Language for VUIs: Exploring Gestures to Enhance Interactions with Voice User Interfaces. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 133-150. DOI: 10.1145/3643834.3660691
  • (2024) Exploration of Foot-based Text Entry Techniques for Virtual Reality Environments. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642757
  • (2024) Cone&Bubble: Evaluating Combinations of Gaze, Head and Hand Pointing for Target Selection in Dense 3D Environments. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 642-649. DOI: 10.1109/VRW62533.2024.00126



      Published In

      SUI '17: Proceedings of the 5th Symposium on Spatial User Interaction
      October 2017
      167 pages
      ISBN: 9781450354868
      DOI: 10.1145/3131277
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 16 October 2017

      Author Tags

      1. ISO 9241-9
      2. eye-tracking
      3. Fitts' law
      4. head-mounted display
      5. selection performance

      Qualifiers

      • Research-article

      Funding Sources

      • Natural Sciences and Engineering Research Council of Canada (NSERC)

      Conference

      SUI '17: Symposium on Spatial User Interaction
      October 16-17, 2017
      Brighton, United Kingdom

      Acceptance Rates

      Overall Acceptance Rate 86 of 279 submissions, 31%

      Article Metrics

      • Downloads (last 12 months): 257
      • Downloads (last 6 weeks): 39
      Reflects downloads up to 19 Nov 2024

