
DOI: 10.1145/2628363.2628370

Probabilistic touchscreen keyboard incorporating gaze point information

Published: 23 September 2014

Abstract

We propose a novel probabilistic keyboard that takes into account the distance between the gaze point and the touch position in order to improve typing efficiency. The proposed keyboard dynamically changes the size of the search space for predicting candidate words, based on a model that estimates the magnitude of touch position errors from the distance between the gaze point and the touch position. This makes it possible for users to type intended words even when they glance at different areas of the screen. Performance was evaluated in terms of input accuracy, measured by total error rate (TER), and typing speed, measured in words per minute (WPM). The results showed that the proposed keyboard reduced TER by 18.2% and increased WPM by 12.7% compared with the conventional keyboard.
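The core idea — widening the keyboard's candidate search space as the gaze point moves away from the touch position — can be sketched as follows. This is a minimal illustration, not the paper's actual model: the key coordinates, the linear variance model in `touch_sigma`, and the constants `base_sigma` and `k` are all hypothetical placeholders.

```python
import math

# Hypothetical key centers for three keys (x, y in pixels).
KEY_CENTERS = {"q": (20, 100), "w": (60, 100), "e": (100, 100)}

def touch_sigma(gaze, touch, base_sigma=15.0, k=0.1):
    """Estimated touch-error spread: assumed to grow linearly with the
    gaze-touch distance. base_sigma and k are illustrative constants."""
    return base_sigma + k * math.dist(gaze, touch)

def key_likelihoods(gaze, touch):
    """Normalized Gaussian likelihood of each key given the touch point,
    with the spread widened when the user was looking elsewhere."""
    sigma = touch_sigma(gaze, touch)
    scores = {}
    for key, (cx, cy) in KEY_CENTERS.items():
        d2 = (touch[0] - cx) ** 2 + (touch[1] - cy) ** 2
        scores[key] = math.exp(-d2 / (2 * sigma ** 2))
    total = sum(scores.values())
    return {key: s / total for key, s in scores.items()}

# Gaze near the touch point: tight distribution, 'w' dominates.
near = key_likelihoods(gaze=(60, 100), touch=(58, 102))
# Gaze far from the touch point: wider distribution, so neighboring
# keys remain plausible and the word-candidate search space grows.
far = key_likelihoods(gaze=(400, 300), touch=(58, 102))
```

A word predictor could then keep every key whose likelihood exceeds a threshold, so the candidate set automatically expands when the user is not looking at the key they touch.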




    Published In

    MobileHCI '14: Proceedings of the 16th international conference on Human-computer interaction with mobile devices & services
    September 2014
    664 pages
    ISBN:9781450330046
    DOI:10.1145/2628363

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye tracking
    2. gaze-supported interaction
    3. mobile devices
    4. multi-modal interface
    5. touchscreen text input

    Qualifiers

    • Short-paper

    Conference

    MobileHCI '14

    Acceptance Rates

    MobileHCI '14 Paper Acceptance Rate 35 of 124 submissions, 28%;
    Overall Acceptance Rate 202 of 906 submissions, 22%

