
Noise tolerant selection by gaze-controlled pan and zoom in 3D

Published: 26 March 2008
DOI: 10.1145/1344471.1344521

Abstract

This paper presents StarGazer, a new 3D interface for gaze-based interaction and target selection using continuous pan and zoom. With StarGazer we address the problem of interacting with graph-structured data and applications (e.g. gaze typing systems) using low-resolution gaze trackers or small displays. We show that robust selection is possible even with a large number of selectable items on the screen and noisy gaze trackers. A test with 48 subjects demonstrated that users who had never tried gaze interaction before could rapidly adapt to the navigation principles of StarGazer. We tested three display sizes (down to PDA-sized displays) and found that large screens are faster to navigate than small ones and that the error rate is higher on the smallest display. Half of the subjects were exposed to severe noise deliberately added to the cursor positions. This reduced efficiency, but users remained in control and the noise did not appear to affect the error rate. Additionally, three subjects tested the effect of temporal noise simulating latency in the gaze tracker; even with significant latency (about 200 ms) they were able to type at acceptable rates. In a second test, seven subjects were allowed to adjust the zooming speed themselves and achieved typing rates of more than eight words per minute without language modeling. We conclude that StarGazer is an intuitive 3D interface for gaze navigation, allowing more selectable objects to be displayed on the screen than the accuracy of the gaze tracker would otherwise permit.
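To make the interaction concrete, the following is a minimal, hypothetical sketch in Python (not the authors' implementation) of a gaze-driven pan-and-zoom update loop with the two disturbance conditions described above: additive spatial noise on the cursor position and an artificial latency of about 200 ms. All names, gains, and the noise magnitude are illustrative assumptions.

    import math
    import random
    from collections import deque

    # Hypothetical sketch, not the authors' code: a gaze-driven pan-and-zoom
    # update loop with the two disturbances studied in the paper -- additive
    # spatial noise on the cursor position and an artificial ~200 ms delay.
    # All constants below are illustrative assumptions.

    UPDATE_HZ = 60           # assumed display/tracker update rate
    LATENCY_MS = 200         # latency level reported in the paper
    NOISE_SIGMA_PX = 30.0    # assumed std. dev. of added cursor noise, pixels
    ZOOM_GAIN = 0.5          # assumed zoom speed (user-adjustable in the study)

    _delay = deque(maxlen=int(LATENCY_MS / 1000 * UPDATE_HZ) + 1)

    def disturbed_gaze(gx, gy):
        """Add Gaussian noise to a gaze sample, then delay it ~LATENCY_MS."""
        _delay.append((gx + random.gauss(0, NOISE_SIGMA_PX),
                       gy + random.gauss(0, NOISE_SIGMA_PX)))
        return _delay[0]  # oldest buffered sample lags real time

    def step(camera, gaze, dt, screen_w, screen_h):
        """Advance the virtual camera one frame.

        camera = (cx, cy, zoom). The view pans toward the on-screen gaze
        point and magnifies continuously, so a target can be selected even
        when the raw gaze signal is too noisy to hit it at the initial scale.
        """
        gx, gy = disturbed_gaze(*gaze)
        cx, cy, zoom = camera
        cx += (gx - screen_w / 2) / zoom * dt   # pan toward the gaze point
        cy += (gy - screen_h / 2) / zoom * dt
        zoom *= math.exp(ZOOM_GAIN * dt)        # continuous exponential zoom
        return (cx, cy, zoom)

For context on the reported speeds, text-entry studies conventionally compute words per minute with one word defined as five characters, so eight words per minute corresponds to roughly 40 characters per minute.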





Published In

ETRA '08: Proceedings of the 2008 symposium on Eye tracking research & applications
March 2008
285 pages
ISBN:9781595939821
DOI:10.1145/1344471


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. 3D interfaces
  2. alternative communication
  3. assistive technology
  4. computer input devices
  5. eye tracking
  6. eye typing
  7. gaze interaction
  8. mobile displays
  9. zooming

Qualifiers

  • Research-article

Conference

ETRA '08
ETRA '08: Eye Tracking Research and Applications
March 26 - 28, 2008
Savannah, Georgia, USA

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)



Cited By

  • GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8 (ETRA), 1-20, 2024. DOI: 10.1145/3655601
  • 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction 8 (ETRA), 1-19, 2024. DOI: 10.1145/3655596
  • SkiMR: Dwell-free Eye Typing in Mixed Reality. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 439-449, 2024. DOI: 10.1109/VR58804.2024.00065
  • The impact of visual and motor space size on gaze-based target selection. Australian Journal of Psychology 76 (1), 2024. DOI: 10.1080/00049530.2024.2309384
  • Leyenes. International Journal of Human-Computer Studies 184 (C), 2024. DOI: 10.1016/j.ijhcs.2023.103204
  • GRACE: Online Gesture Recognition for Autonomous Camera-Motion Enhancement in Robot-Assisted Surgery. IEEE Robotics and Automation Letters 8 (12), 8263-8270, 2023. DOI: 10.1109/LRA.2023.3326690
  • Enhancing Hybrid Eye Typing Interfaces with Word and Letter Prediction: A Comprehensive Evaluation. International Journal of Human-Computer Interaction, 1-13, 2023. DOI: 10.1080/10447318.2023.2297113
  • Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-12, 2022. DOI: 10.1145/3562939.3565619
  • Usability of the super-vowel for gaze-based text entry. 2022 Symposium on Eye Tracking Research and Applications, 1-5, 2022. DOI: 10.1145/3517031.3529231
  • SPEye: A Calibration-Free Gaze-Driven Text Entry Technique Based on Smooth Pursuit. IEEE Transactions on Human-Machine Systems 52 (2), 312-323, 2022. DOI: 10.1109/THMS.2021.3123202
