DOI: 10.1145/2559206.2581163
Glasses with haptic feedback of gaze gestures

Published: 26 April 2014

Abstract

We introduce eyeglasses that present haptic feedback when gaze gestures are used for input. The glasses use vibrotactile actuators to provide gentle stimulation at three locations on the user's head. We describe two initial user studies conducted to evaluate how easily the feedback locations could be recognized and which feedback participants preferred when it was combined with gaze gestures. The results showed that feedback from a single actuator was both the easiest to recognize and the preferred option when used with gaze gestures. We conclude by presenting future use scenarios that could benefit from gaze gestures and haptic feedback.
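The scheme described above — a recognized gaze gesture triggering a short pulse from one of three head-mounted vibrotactile actuators — can be sketched as a small event router. This is purely illustrative: the actuator positions, gesture names, mapping, and pulse length below are assumptions for the sketch, not details from the poster.

```python
# Illustrative sketch only: routes gaze-gesture events to one of three
# vibrotactile actuators, using the single-actuator feedback variant the
# abstract reports participants preferred. All names (actuator positions,
# gesture labels) and the 100 ms pulse length are hypothetical.
from dataclasses import dataclass, field

ACTUATORS = ("left_temple", "right_temple", "back_of_head")  # assumed positions


@dataclass
class HapticGlasses:
    """Maps recognized gaze gestures to short vibrotactile pulses."""
    pulse_ms: int = 100                      # assumed default pulse duration
    log: list = field(default_factory=list)  # stands in for driving real motors

    def pulse(self, actuator, duration_ms=None):
        """Fire one actuator; here we just record the command."""
        if actuator not in ACTUATORS:
            raise ValueError(f"unknown actuator: {actuator}")
        self.log.append((actuator, duration_ms or self.pulse_ms))

    def on_gesture(self, gesture):
        """Single-actuator confirmation feedback for a recognized gesture."""
        mapping = {                      # hypothetical gesture -> actuator map
            "look_left": "left_temple",
            "look_right": "right_temple",
            "look_up": "back_of_head",
        }
        self.pulse(mapping[gesture])


glasses = HapticGlasses()
glasses.on_gesture("look_left")   # pulses the left-temple actuator
glasses.on_gesture("look_right")  # pulses the right-temple actuator
```

A real implementation would replace the log with motor-driver calls and take gesture events from an eye tracker; the point of the sketch is only the one-gesture, one-actuator feedback mapping.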

Supplementary Material

ZIP File (wip0176-file3.zip)
Zip file containing a PDF of the Accompanying Poster



Published In

CHI EA '14: CHI '14 Extended Abstracts on Human Factors in Computing Systems
April 2014, 2620 pages
ISBN: 9781450324748
DOI: 10.1145/2559206
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gaze gestures
    2. gaze input
    3. haptics
    4. vibrotactile feedback
    5. wearable computing

    Qualifiers

    • Poster

Conference

CHI '14: CHI Conference on Human Factors in Computing Systems
April 26 - May 1, 2014
Toronto, Ontario, Canada

Acceptance Rates

CHI EA '14 paper acceptance rate: 1,000 of 3,200 submissions (31%)
Overall acceptance rate: 6,164 of 23,696 submissions (26%)

    Cited By

• (2024) Designing Haptic Feedback for Sequential Gestural Inputs. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642735. Online publication date: 11-May-2024.
• (2023) Evaluation of precision, accuracy and threshold for the design of vibrotactile feedback in eye tracking applications. Journal of Sensors and Sensor Systems 12(1), 103-109. DOI: 10.5194/jsss-12-103-2023. Online publication date: 5-Apr-2023.
• (2019) mobEYEle. Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, 1113-1119. DOI: 10.1145/3341162.3350842. Online publication date: 9-Sep-2019.
• (2019) EyeControl. Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, 628-632. DOI: 10.1145/3341162.3348384. Online publication date: 9-Sep-2019.
• (2019) Tiger. Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, 1-11. DOI: 10.1145/3338286.3340117. Online publication date: 1-Oct-2019.
• (2017) Directional cueing of gaze with a vibrotactile headband. Proceedings of the 8th Augmented Human International Conference, 1-7. DOI: 10.1145/3041164.3041176. Online publication date: 16-Mar-2017.
• (2017) Vibrotactile stimulation of the head enables faster gaze gestures. International Journal of Human-Computer Studies 98, 62-71. DOI: 10.1016/j.ijhcs.2016.10.004. Online publication date: 1-Feb-2017.
• (2017) Gaze Cueing with a Vibrotactile Headband for a Visual Search Task. Augmented Human Research 2(1). DOI: 10.1007/s41133-017-0008-0. Online publication date: 28-Jul-2017.
• (2016) What If Devices Take Command. International Journal of Handheld Computing Research 7(2), 16-33. DOI: 10.4018/IJHCR.2016040102. Online publication date: Apr-2016.
• (2016) The use of technology to provide physical interaction experiences for cognitively able young people who have complex physical disabilities. Proceedings of the 30th International BCS Human Computer Interaction Conference: Fusion!, 1-6. DOI: 10.14236/ewic/HCI2016.11. Online publication date: 11-Jul-2016.
