DOI: 10.1145/3411763.3451703

GazeBar: Exploiting the Midas Touch in Gaze Interaction

Published: 08 May 2021

Abstract

Imagine an application that requires constant configuration changes, such as modifying the brush type in a drawing application. Typically, options are organized hierarchically in menu bars that the user must navigate, sometimes through several levels, to select the desired mode. An alternative that reduces hand motion is the use of multimodal techniques such as gaze-touch, which combine gaze pointing with mechanical selection. In this paper, we introduce GazeBar, a novel multimodal gaze interaction technique that uses gaze paths as a combined pointing and selection mechanism. The idea behind GazeBar is to maximize interaction flow by relaxing “safety” mechanisms (such as clicking) under certain circumstances. We present GazeBar’s design and demonstrate it using a digital drawing application prototype. Advantages and disadvantages of GazeBar are discussed based on a user performance model.
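Since the abstract only sketches the technique, the following is a minimal, hypothetical Python sketch of gaze-path mode switching in that spirit: a glance that crosses a toolbar item selects it immediately, with no click or dwell, while gaze is ignored during a stroke so stray fixations cannot change the mode mid-drawing. The toolbar geometry, the pen-state gate, and all names and thresholds are assumptions for illustration, not the authors’ implementation.

```python
# Hypothetical sketch of GazeBar-style mode switching (not the paper's code).
# Assumption: a toolbar docked at the top of the canvas; the "safety"
# mechanism (clicking) is dropped only while the pen is lifted.

from dataclasses import dataclass

@dataclass
class ToolButton:
    name: str     # mode this button activates, e.g. "airbrush"
    x: float      # left edge of the button, in screen pixels
    width: float  # button width, in screen pixels

@dataclass
class GazeSample:
    x: float
    y: float

TOOLBAR_TOP = 0.0      # assumed: bar flush with the top edge
TOOLBAR_HEIGHT = 48.0  # assumed bar height, in pixels

def update_mode(gaze: GazeSample, pen_down: bool,
                buttons: list[ToolButton], current_mode: str) -> str:
    """Return the active mode after processing one gaze sample."""
    if pen_down:
        # While drawing, gaze is free to roam: strokes are never
        # interrupted by a stray fixation on the bar.
        return current_mode
    if TOOLBAR_TOP <= gaze.y <= TOOLBAR_TOP + TOOLBAR_HEIGHT:
        for b in buttons:
            if b.x <= gaze.x < b.x + b.width:
                # The gaze path crossing a button selects it directly:
                # no click, no dwell timeout.
                return b.name
    return current_mode
```

Under these assumptions, the per-sample check is what removes the extra selection step: for example, `update_mode(GazeSample(60, 20), pen_down=False, buttons=[ToolButton("pencil", 0, 50), ToolButton("eraser", 50, 50)], current_mode="pencil")` returns `"eraser"` the moment the gaze path enters that button.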

Supplemental Material

MP4 File
Supplemental video

Published In

CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
May 2021
2965 pages
ISBN: 9781450380959
DOI: 10.1145/3411763
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. gaze interaction
  2. interface modes
  3. multimodal interaction

Qualifiers

  • Poster
  • Research
  • Refereed limited

Funding Sources

  • São Paulo Research Foundation

Conference

CHI '21

Acceptance Rates

Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

Article Metrics

  • Downloads (Last 12 months): 98
  • Downloads (Last 6 weeks): 13
Reflects downloads up to 26 Sep 2024

Cited By

  • (2024) A real-time camera-based gaze-tracking system involving dual interactive modes and its application in gaming. Multimedia Systems 30(1). https://doi.org/10.1007/s00530-023-01204-9. Online publication date: 16-Jan-2024.
  • (2023) Gaze-based Mode-Switching to Enhance Interaction with Menus on Tablets. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–8. https://doi.org/10.1145/3588015.3588409. Online publication date: 30-May-2023.
  • (2023) Gaze & Tongue: A Subtle, Hands-Free Interaction for Head-Worn Devices. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1–4. https://doi.org/10.1145/3544549.3583930. Online publication date: 19-Apr-2023.
  • (2023) Affordance-Guided User Elicitation of Interaction Concepts for Unimodal Gaze Control of Potential Holographic 3D UIs in Automotive Applications. 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 14–19. https://doi.org/10.1109/ISMAR-Adjunct60411.2023.00011. Online publication date: 16-Oct-2023.
  • (2023) DVGaze: Dual-View Gaze Estimation. 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 20575–20584. https://doi.org/10.1109/ICCV51070.2023.01886. Online publication date: 1-Oct-2023.
  • (2022) GazeDock: Gaze-Only Menu Selection in Virtual Reality using Auto-Triggering Peripheral Menu. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 832–842. https://doi.org/10.1109/VR51125.2022.00105. Online publication date: Mar-2022.
  • (2022) Glance-Box: Multi-LOD Glanceable Interfaces for Machine Shop Guidance in Augmented Reality using Blink and Hand Interaction. 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 315–321. https://doi.org/10.1109/ISMAR-Adjunct57072.2022.00070. Online publication date: Oct-2022.
