DOI: 10.1145/3517031.3529231

Usability of the super-vowel for gaze-based text entry

Published: 08 June 2022

Abstract

We experimentally tested the idea of reducing the number of buttons in a gaze-based text entry system by replacing all vowels with a single diamond character, which we call the super-vowel. The idea is inspired by historical optimizations of written language, such as abjads, writing systems whose scripts omit most vowels. This reduces the number of items on the screen, simplifying text input and making it possible to enlarge the buttons. However, the modification can also act as a distractor that increases the number of errors. An experiment with 29 participants showed that, for non-standard text entry methods, the modification slightly increases text entry speed and reduces the number of errors. However, this does not apply to the standard on-screen keyboard, a direct transformation of physical computer keyboards with a QWERTY layout.
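The core substitution described in the abstract can be sketched as a simple character mapping. This is an illustrative sketch only: the diamond glyph (U+25C6), the vowel set, and the function name are assumptions, and the study applies the idea at the keyboard-layout level rather than as a post-hoc text filter.

```python
# Minimal sketch of the super-vowel idea: collapse all vowels into one symbol.
# The glyph and the English vowel set are assumptions for illustration.
SUPER_VOWEL = "\u25C6"  # '◆'
VOWELS = set("aeiouAEIOU")

def collapse_vowels(text: str) -> str:
    """Replace every vowel with the single super-vowel character."""
    return "".join(SUPER_VOWEL if ch in VOWELS else ch for ch in text)

print(collapse_vowels("gaze-based text entry"))  # → g◆z◆-b◆s◆d t◆xt ◆ntry
```

With five vowels folded into one key, an alphabetic on-screen keyboard needs 22 buttons instead of 26, which is where the extra button area comes from.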




Published In

ETRA '22: 2022 Symposium on Eye Tracking Research and Applications
June 2022
408 pages
ISBN:9781450392525
DOI:10.1145/3517031
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Eye-tracking
  2. Gaze interaction
  3. Gaze-based text entry
  4. Human-Computer interfaces

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • Academic Business Incubator of Nicolaus Copernicus University

Conference

ETRA '22

Acceptance Rates

ETRA '22 Paper Acceptance Rate: 15 of 39 submissions, 38%
Overall Acceptance Rate: 69 of 137 submissions, 50%

