Swipe&Switch: Text Entry Using Gaze Paths and Context Switching

Published: 20 October 2020
DOI: 10.1145/3379350.3416193

Abstract

Swipe-based methods for text entry by gaze allow users to swipe through the letters of a word by gaze, analogous to how they can swipe with a finger on a touchscreen keyboard. These methods face two challenges: (1) gaze paths do not have clear start and end positions, and (2) text editing features are difficult to design. We introduce Swipe&Switch, a text-entry interface that uses swiping and context switching to improve gaze-based interaction. The interface contains three context regions; it detects the start and end of a gesture and emits text-editing commands (e.g., word insertion, deletion) when the user switches focus between these regions. A user study showed that Swipe&Switch provides a better user experience and a higher text entry rate than a baseline, EyeSwipe.
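
A minimal sketch of the context-switching idea described in the abstract, assuming three rectangular regions (a keyboard region plus hypothetical "insert" and "delete" regions): gaze samples are classified by region, a gaze path is recorded while focus stays on the keyboard, and switching focus to another region ends the gesture and emits a command. The geometry, region names, and command mapping below are illustrative assumptions, not the actual Swipe&Switch layout.

# Minimal sketch of context-switch detection for gaze swiping (illustrative only).
# Region names, geometry, and the command mapping are assumptions, not the
# actual Swipe&Switch implementation.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class Region:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        # Axis-aligned rectangle test in screen coordinates.
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


@dataclass
class ContextSwitchDetector:
    regions: List[Region]
    current: Optional[str] = None                       # region currently fixated
    path: List[Tuple[float, float]] = field(default_factory=list)

    def feed(self, x: float, y: float) -> Optional[str]:
        """Consume one gaze sample; return an editing command when a switch ends a swipe."""
        region = next((r.name for r in self.regions if r.contains(x, y)), None)
        command = None
        if region != self.current:                      # focus switched regions
            if self.current == "keyboard" and self.path:
                # Leaving the keyboard ends the swipe; the region the gaze
                # lands on selects the command (hypothetical mapping).
                if region == "insert":
                    command = f"insert word decoded from {len(self.path)} path samples"
                elif region == "delete":
                    command = "delete last word"
                self.path = []
            self.current = region
        if region == "keyboard":
            self.path.append((x, y))                    # record the gaze path
        return command


# Usage: stream gaze samples; a command is emitted when focus leaves the keyboard.
detector = ContextSwitchDetector(regions=[
    Region("insert",   0,   0, 800, 200),
    Region("keyboard", 0, 200, 800, 600),
    Region("delete",   0, 600, 800, 800),
])
for sample in [(100, 300), (300, 350), (500, 320), (400, 100)]:
    cmd = detector.feed(*sample)
    if cmd:
        print(cmd)      # -> "insert word decoded from 3 path samples"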

Supplementary Material

MP4 File (3379350.3416193.mp4)
Presentation Video

References

[1]
Antonio Diaz-Tula and Carlos H. Morimoto. AugKey: Increasing foveal throughput in eye typing with augmented keys. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI '16, pages 3533--3544, New York, NY, USA, 2016. ACM.
[2]
Dan Witzner Hansen, Henrik H. T. Skovsgaard, John Paulin Hansen, and Emilie Møllenbach. Noise tolerant selection by gaze-controlled pan and zoom in 3d. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, ETRA '08, pages 205--212, New York, NY, USA, 2008. ACM.
[3]
Anke Huckauf and Mario Urbina. Gazing with pEYE: New concepts in eye typing. In Proceedings of the 4th Symposium on Applied Perception in Graphics and Visualization, APGV '07, pages 141--141, New York, NY, USA, 2007. ACM.
[4]
Poika Isokoski. Text input methods for eye trackers using off-screen targets. In Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, ETRA '00, pages 15--21, New York, NY, USA, 2000. ACM.
[5]
Robert J. K. Jacob. What you look at is what you get: Eye movement-based interaction techniques. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '90, pages 11--18, New York, NY, USA, 1990. ACM.
[6]
Per Ola Kristensson and Keith Vertanen. The potential of dwell-free eye-typing for fast assistive gaze communication. In Proceedings of the Symposium on Eye Tracking Research and Applications, ETRA '12, pages 241--244, New York, NY, USA, 2012. ACM.
[7]
Per-Ola Kristensson and Shumin Zhai. SHARK2: A large vocabulary shorthand writing system for pen-based computers. In Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology, UIST '04, pages 43--52, New York, NY, USA, 2004. ACM.
[8]
Andrew T. N. Kurauchi, Wenxin Feng, Ajjen Joshi, Carlos H. Morimoto, and Margrit Betke. EyeSwipe: Dwell-free text entry using gaze paths. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI '16, pages 1952--1956, New York, NY, USA, 2016. ACM.
[9]
I. Scott MacKenzie and R. William Soukoreff. Phrase sets for evaluating text entry techniques. In CHI '03 Extended Abstracts on Human Factors in Computing Systems, CHI EA '03, pages 754--755, New York, NY, USA, 2003. ACM.
[10]
Päivi Majaranta, Ulla-Kaija Ahola, and Oleg Špakov. Fast gaze typing with an adjustable dwell time. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '09, pages 357--360, New York, NY, USA, 2009. ACM.
[11]
Päivi Majaranta and Kari-Jouko Räihä. Twenty years of eye typing: Systems and design issues. In Proceedings of the 2002 Symposium on Eye Tracking Research & Applications, ETRA '02, pages 15--22, New York, NY, USA, 2002. ACM.
[12]
Carlos H. Morimoto and Arnon Amir. Context switching for fast key selection in text entry applications. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications, ETRA '10, pages 271--274, New York, NY, USA, 2010. ACM.
[13]
Martez E. Mott, Shane Williams, Jacob O. Wobbrock, and Meredith Ringel Morris. Improving dwell-based gaze typing with dynamic, cascading dwell times. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI '17, pages 2558--2570, New York, NY, USA, 2017. ACM.
[14]
Diogo Pedrosa, Maria Da Graça Pimentel, Amy Wright, and Khai N. Truong. Filteryedping: Design challenges and user performance of dwell-free eye typing. ACM Trans. Access. Comput., 6(1):3:1--3:37, March 2015.
[15]
Marco Porta and Matteo Turina. Eye-S: A full-screen input modality for pure eye-based communication. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, ETRA '08, pages 27--34, New York, NY, USA, 2008. ACM.
[16]
Sayan Sarcar, Prateek Panwar, and Tuhin Chakraborty. EyeK: An efficient dwell-free eye gaze-based text entry system. In Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction, APCHI '13, pages 215--220, New York, NY, USA, 2013. ACM.
[17]
Darius Miniotas. On-line adjustment of dwell time for target selection by gaze. In Proceedings of the Third Nordic Conference on Human-computer Interaction, NordiCHI '04, pages 203--206, New York, NY, USA, 2004. ACM.
[18]
David J. Ward, Alan F. Blackwell, and David J. C. MacKay. Dasher--a data entry interface using continuous gestures and language models. In Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology, UIST '00, pages 129--137, New York, NY, USA, 2000. ACM.
[19]
Jacob O. Wobbrock, James Rubinstein, Michael W. Sawyer, and Andrew T. Duchowski. Longitudinal evaluation of discrete consecutive gaze gestures for text entry. In Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, ETRA '08, pages 11--18, New York, NY, USA, 2008. ACM.
[20]
Wenge Xu, Hai-Ning Liang, Yuxuan Zhao, Tianyu Zhang, Difeng Yu, and Diego Monteiro. RingText: Dwell-free and hands-free text entry for mobile head-mounted displays using head motions. IEEE Transactions on Visualization and Computer Graphics, 25(5):1991--2001, 2019.



      Information & Contributors

      Information

      Published In

      UIST '20 Adjunct: Adjunct Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology
      October 2020
      203 pages
      ISBN: 9781450375153
      DOI: 10.1145/3379350
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 20 October 2020

      Author Tags

      1. eye tracking
      2. eye typing
      3. gaze swiping
      4. gesture-based typing
      5. text entry

      Qualifiers

      • Poster

      Funding Sources

      • São Paulo Research Foundation (FAPESP)

      Conference

      UIST '20

      Acceptance Rates

      Overall Acceptance Rate 561 of 2,567 submissions, 22%

      Bibliometrics & Citations

      Article Metrics

      • Downloads (Last 12 months): 46
      • Downloads (Last 6 weeks): 9
      Reflects downloads up to 19 Nov 2024

      Cited By
      • (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction, 8(ETRA), 1-19. https://doi.org/10.1145/3655596. Online publication date: 28-May-2024.
      • (2024) The Guided Evaluation Method. International Journal of Human-Computer Studies, 190(C). https://doi.org/10.1016/j.ijhcs.2024.103317. Online publication date: 1-Oct-2024.
      • (2023) Does One Keyboard Fit All? Comparison and Evaluation of Device-Free Augmented Reality Keyboard Designs. Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology, 1-11. https://doi.org/10.1145/3611659.3615692. Online publication date: 9-Oct-2023.
      • (2023) GlanceWriter: Writing Text by Glancing Over Letters with Gaze. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. https://doi.org/10.1145/3544548.3581269. Online publication date: 19-Apr-2023.
      • (2022) Online eye-movement classification with temporal convolutional networks. Behavior Research Methods, 55(7), 3602-3620. https://doi.org/10.3758/s13428-022-01978-2. Online publication date: 11-Oct-2022.
      • (2022) Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-12. https://doi.org/10.1145/3562939.3565619. Online publication date: 29-Nov-2022.
