DOI: 10.1145/3204493.3204536

Error-aware gaze-based interfaces for robust mobile gaze interaction

Published: 14 June 2018

Abstract

Gaze estimation error can severely hamper usability and performance of mobile gaze-based interfaces given that the error varies constantly for different interaction positions. In this work, we explore error-aware gaze-based interfaces that estimate and adapt to gaze estimation error on-the-fly. We implement a sample error-aware user interface for gaze-based selection and different error compensation methods: a naïve approach that increases component size directly proportional to the absolute error, a recent model by Feit et al. that is based on the two-dimensional error distribution, and a novel predictive model that shifts gaze by a directional error estimate. We evaluate these models in a 12-participant user study and show that our predictive model significantly outperforms the others in terms of selection rate, particularly for small gaze targets. These results underline both the feasibility and potential of next generation error-aware gaze-based user interfaces.
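
The abstract describes the compensation strategies only at a high level. As a rough illustration, the following Python sketch contrasts the naive strategy (growing a target in proportion to the absolute error expected at its position) with the predictive strategy (shifting the raw gaze sample by a directional error estimate before hit-testing). All names, the data layout, and the numeric offsets are illustrative assumptions, not the authors' implementation.

    from dataclasses import dataclass

    @dataclass
    class Target:
        x: float          # target centre (screen px)
        y: float
        radius: float     # nominal selection radius (px)

    def naive_resize(target: Target, abs_error_px: float, gain: float = 1.0) -> Target:
        # Naive compensation: enlarge the selectable area in proportion to the
        # absolute gaze estimation error expected at the target's position.
        return Target(target.x, target.y, target.radius + gain * abs_error_px)

    def predictive_shift(gaze, err):
        # Predictive compensation: shift the raw gaze sample by a directional
        # error estimate predicted for this screen position; targets unchanged.
        return (gaze[0] - err[0], gaze[1] - err[1])

    def hit(target: Target, gaze) -> bool:
        # Circular hit test shared by both strategies.
        return (gaze[0] - target.x) ** 2 + (gaze[1] - target.y) ** 2 <= target.radius ** 2

    if __name__ == "__main__":
        button = Target(x=400.0, y=300.0, radius=20.0)
        raw_gaze = (430.0, 310.0)   # raw gaze sample, ~31.6 px off centre: a miss without compensation
        predicted = (28.0, 9.0)     # hypothetical directional error estimate for this position

        print(hit(naive_resize(button, abs_error_px=32.0), raw_gaze))    # True: target grew
        print(hit(button, predictive_shift(raw_gaze, predicted)))        # True: gaze corrected

The model of Feit et al. mentioned in the abstract instead derives the selection region from the two-dimensional error distribution; a comparable sketch would replace the single radius with position-dependent horizontal and vertical margins.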

References

[1]
Stanislavs Bardins, Tony Poitschke, and Stefan Kohlbecher. 2008. Gaze-based Interaction in Various Environments. In Proceedings of the 1st ACM Workshop on Vision Networks for Behavior Analysis (VNBA '08). ACM, New York, NY, USA, 47--54.
[2]
Michael Barz, Andreas Bulling, and Florian Daiber. 2016. Prediction of Gaze Estimation Error for Error-Aware Gaze-Based Interfaces. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '16). ACM, New York, NY, USA.
[3]
Michael Barz, Peter Poller, and Daniel Sonntag. 2017. Evaluating Remote and Head-worn Eye Trackers in Multi-modal Speech-based HRI. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Bilge Mutlu, Manfred Tscheligi, Astrid Weiss, and James E Young (Eds.). ACM, New York, NY, USA, 79--80.
[4]
Michael Barz and Daniel Sonntag. 2016. Gaze-guided object classification using deep neural networks for attention-based computing. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct (UbiComp '16 Adjunct). ACM, New York, NY, USA, 253--256.
[5]
Pieter Blignaut, Kenneth Holmqvist, Marcus Nyström, and Richard Dewhurst. 2014. Improving the Accuracy of Video-Based Eye-Tracking in Real-Time through Post-Calibration Regression. Springer, 77--100.
[6]
Pieter Blignaut and Daniël Wium. 2013. The Effect of Mapping Function on the Accuracy of a Video-based Eye Tracker. In Proceedings of the 2013 Conference on Eye Tracking South Africa (ETSA '13). ACM, New York, NY, USA, 39--46.
[7]
Jurek Breuninger, Christian Lange, and Klaus Bengler. 2011. Implementing Gaze Control for Peripheral Devices. In Proceedings of the 1st International Workshop on Pervasive Eye Tracking & Mobile Eye-based Interaction (PETMEI '11). ACM, New York, NY, USA, 3--8.
[8]
Andreas Bulling, Daniel Roggen, and Gerhard Tröster. 2008. EyeMote --- Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography. In Proceedings of the 2nd International Conference on Fun and Games. Springer-Verlag, Berlin, Heidelberg, 33--45.
[9]
Géry Casiez, Nicolas Roussel, and Daniel Vogel. 2012. 1 Euro Filter: A Simple Speed-based Low-pass Filter for Noisy Input in Interactive Systems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '12). ACM, New York, NY, USA, 2527--2530.
[10]
Juan J. Cerrolaza, Arantxa Villanueva, Maria Villanueva, and Rafael Cabeza. 2012. Error Characterization and Compensation in Eye Tracking Systems. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 205--208.
[11]
Jan Drewes, Guillaume S. Masson, and Anna Montagnini. 2012. Shifts in Reported Gaze Position Due to Changes in Pupil Size: Ground Truth and Compensation. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 209--212.
[12]
Augusto Esteves, Eduardo Velloso, Andreas Bulling, and Hans Gellersen. 2015. Orbits: Gaze Interaction for Smart Watches Using Smooth Pursuit Eye Movements. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 457--466.
[13]
Anna Maria Feit, Shane Williams, Arturo Toledo, Ann Paradiso, Harish Kulkarni, Shaun Kane, and Meredith Ringel Morris. 2017. Toward Everyday Gaze Input: Accuracy and Precision of Eye Tracking and Implications for Design. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, New York, NY, USA, 1118--1130.
[14]
Jeremy Hales, David Rozado, and Diako Mardanbegi. 2011. Interacting with Objects in the Environment by Gaze and Hand Gestures. In Proceedings of the 3rd International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction. 1--9.
[15]
Robert J. K. Jacob. 1991. The Use of Eye Movements in Human-computer Interaction Techniques: What You Look at is What You Get. ACM Trans. Inf. Syst. 9, 2 (April 1991), 152--169.
[16]
Samuel John, Erik Weitnauer, and Hendrik Koesling. 2012. Entropy-based Correction of Eye Tracking Data for Static Scenes. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 297--300.
[17]
Moritz Kassner, William Patera, and Andreas Bulling. 2014. Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication (UbiComp '14 Adjunct). ACM, New York, NY, USA, 1151--1160.
[18]
Christian Lander, Sven Gehring, Antonio Krüger, Sebastian Boring, and Andreas Bulling. 2015a. GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 395--404.
[19]
Christian Lander, Marco Speicher, Denise Paradowski, Norine Coenen, Sebastian Biewer, and Antonio Krüger. 2015b. Collaborative Newspaper: Exploring an Adaptive Scrolling Algorithm in a Multi-user Reading Scenario. In Proceedings of the 4th International Symposium on Pervasive Displays (PerDis '15). ACM, New York, NY, USA, 163--169.
[20]
Diako Mardanbegi and Dan Witzner Hansen. 2011. Mobile Gaze-based Screen Interaction in 3D Environments. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 2, 4 pages.
[21]
Diako Mardanbegi and Dan Witzner Hansen. 2012. Parallax Error in the Monocular Head-mounted Eye Trackers. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing (UbiComp '12). ACM, New York, NY, USA, 689--694.
[22]
Darius Miniotas, Oleg Špakov, and I. Scott MacKenzie. 2004. Eye Gaze Interaction with Expanding Targets. In CHI '04 Extended Abstracts on Human Factors in Computing Systems (CHI EA '04). ACM, New York, NY, USA, 1255--1258.
[23]
A. Monden, K. Matsumoto, and M. Yamato. 2005. Evaluation of Gaze-Added Target Selection Methods Suitable for General GUIs. Int. J. Comput. Appl. Technol. 24, 1 (June 2005), 17--24.
[24]
Marcus Nyström, Richard Andersson, Kenneth Holmqvist, and Joost van de Weijer. 2013. The influence of calibration method and eye physiology on eyetracking data quality. Behavior Research Methods 45, 1 (2013), 272--288.
[25]
F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. 2011. Scikit-learn: Machine Learning in Python. Journal of Machine Learning Research 12 (2011), 2825--2830.
[26]
Jeffrey S. Shell, Roel Vertegaal, and Alexander W. Skaburskis. 2003. EyePliances: Attention-seeking Devices That Respond to Visual Attention. In CHI '03 Extended Abstracts on Human Factors in Computing Systems (CHI EA '03). ACM, New York, NY, USA, 770--771.
[27]
Daniel Sonntag. 2015. Kognit: Intelligent Cognitive Enhancement Technology by Cognitive Models and Mixed Reality for Dementia Patients. In 2015 AAAI Fall Symposium Series. https://www.aaai.org/ocs/index.php/FSS/FSS15/paper/view/11702
[28]
Sophie Stellmach and Raimund Dachselt. 2012. Look & Touch: Gaze-supported Target Acquisition. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '12). ACM, New York, NY, USA, 2981--2990.
[29]
Sophie Stellmach, Sebastian Stober, Andreas Nürnberger, and Raimund Dachselt. 2011. Designing Gaze-supported Multimodal Interactions for the Exploration of Large Image Collections. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 1, 8 pages.
[30]
Takumi Toyama, Thomas Kieninger, Faisal Shafait, and Andreas Dengel. 2012. Gaze Guided Object Recognition Using a Head-mounted Eye Tracker. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 91--98.
[31]
Takumi Toyama and Daniel Sonntag. 2015. Towards episodic memory support for dementia patients by recognizing objects, faces and text in eye gaze. KI 2015: Advances in Artificial Intelligence 9324 (2015), 316--323.
[32]
Jayson Turner, Andreas Bulling, Jason Alexander, and Hans Gellersen. 2014. Cross-device Gaze-supported Point-to-point Content Transfer. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '14). ACM, New York, NY, USA, 19--26.
[33]
Mélodie Vidal, Andreas Bulling, and Hans Gellersen. 2013. Pursuits: Spontaneous Interaction with Displays Based on Smooth Pursuit Eye Movement and Moving Targets. In Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '13). ACM, New York, NY, USA, 439--448.
[34]
Oleg Špakov. 2011. Comparison of Gaze-to-objects Mapping Algorithms. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM, New York, NY, USA, Article 6, 8 pages.
[35]
Oleg Špakov. 2012. Comparison of Eye Movement Filters Used in HCI. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12). ACM, New York, NY, USA, 281--284.
[36]
Oleg Špakov and Yulia Gizatdinova. 2014. Real-time Hidden Gaze Point Correction. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '14). ACM, New York, NY, USA, 291--294.
[37]
Lawrence H. Yu and E. Eizenman. 2004. A new methodology for determining point-of-gaze in head-mounted eye tracking systems. IEEE Transactions on Biomedical Engineering 51, 10 (Oct 2004), 1765--1773.
[38]
Xinyong Zhang, Xiangshi Ren, and Hongbin Zha. 2008. Improving Eye Cursor's Stability for Eye Pointing Tasks. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '08). ACM, New York, NY, USA, 525--534.
[39]
Yanxia Zhang, Andreas Bulling, and Hans Gellersen. 2013. SideWays: A Gaze Interface for Spontaneous Interaction with Situated Displays. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, New York, NY, USA, 851--860.
[40]
Yunfeng Zhang and Anthony J. Hornof. 2011. Mode-of-disparities error correction of eye-tracking data. Behavior Research Methods 43, 3 (September 2011), 834--842.
[41]
Yunfeng Zhang and Anthony J. Hornof. 2014. Easy Post-hoc Spatial Recalibration of Eye Tracking Data. In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '14). ACM, New York, NY, USA, 95--98.
[42]
Yanxia Zhang, Jörg Müller, Ming Ki Chong, Andreas Bulling, and Hans Gellersen. 2014. GazeHorizon: Enabling Passers-by to Interact with Public Displays by Gaze. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '14). ACM, New York, NY, USA, 559--563.



Published In

ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
June 2018
595 pages
ISBN:9781450357067
DOI:10.1145/3204493
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 June 2018


Badges

  • Best Paper

Author Tags

  1. error model
  2. error-aware
  3. eye tracking
  4. gaze interaction
  5. mobile interaction

Qualifiers

  • Research-article

Funding Sources

  • Saarland University, Germany
  • Federal Ministry of Education and Research (BMBF)

Conference

ETRA '18

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


Article Metrics

  • Downloads (Last 12 months): 58
  • Downloads (Last 6 weeks): 4
Reflects downloads up to 26 Sep 2024


Cited By

  • (2024) The Effect of Degraded Eye Tracking Accuracy on Interactions in VR. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3656369. Online publication date: 4-Jun-2024.
  • (2024) Communication breakdown: Gaze-based prediction of system error for AI-assisted robotic arm simulated in VR. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3653339. Online publication date: 4-Jun-2024.
  • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56, 2, 1-38. DOI: 10.1145/3606947. Online publication date: 15-Sep-2023.
  • (2023) GE-Simulator: An Open-Source Tool for Simulating Real-Time Errors for HMD-based Eye Trackers. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-6. DOI: 10.1145/3588015.3588417. Online publication date: 30-May-2023.
  • (2023) Interactive Fixation-to-AOI Mapping for Mobile Eye Tracking Data based on Few-Shot Image Classification. Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 175-178. DOI: 10.1145/3581754.3584179. Online publication date: 27-Mar-2023.
  • (2023) IMETA: An Interactive Mobile Eye Tracking Annotation Method for Semi-automatic Fixation-to-AOI mapping. Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 33-36. DOI: 10.1145/3581754.3584125. Online publication date: 27-Mar-2023.
  • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580871. Online publication date: 19-Apr-2023.
  • (2022) Implicit Estimation of Paragraph Relevance From Eye Movements. Frontiers in Computer Science 3. DOI: 10.3389/fcomp.2021.808507. Online publication date: 7-Jan-2022.
  • (2022) Impact of Gaze Uncertainty on AOIs in Information Visualisations. 2022 Symposium on Eye Tracking Research and Applications, 1-6. DOI: 10.1145/3517031.3531166. Online publication date: 8-Jun-2022.
  • (2022) Interactive Assessment Tool for Gaze-based Machine Learning Models in Information Retrieval. Proceedings of the 2022 Conference on Human Information Interaction and Retrieval, 332-336. DOI: 10.1145/3498366.3505834. Online publication date: 14-Mar-2022.
