
Gazeover -- Exploring the UX of Gaze-triggered Affordance Communication for GUI Elements

Published: 02 October 2018

Abstract

The user experience (UX) of graphical user interfaces (GUIs) often depends on how clearly visual designs communicate or signify "affordances", such as whether an element on the screen can be pushed, dragged, or rotated. Especially for novice users, figuring out the complexity of a new interface can be cumbersome. In the "past" era of mouse-based interaction, mouseover effects were successfully utilized to trigger a variety of assistance and to help users explore interface elements without causing unintended interactions and associated negative experiences. Today's GUIs are increasingly designed for touch and lack a method similar to mouseover that helps (novice) users get acquainted with interface elements. To address this issue, we have studied gazeover, a technique for triggering "help or guidance" when a user's gaze is over an interactive element, which we believe is suitable for today's touch interfaces. We report on a user study comparing pragmatic and hedonic qualities of gazeover and mouseover, which showed significantly higher ratings in hedonic quality for the gazeover technique. We conclude by discussing limitations and implications of our findings.
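The technique described in the abstract maps naturally onto a dwell-based trigger: track the gaze point, hit-test it against on-screen elements, and reveal an affordance hint once the gaze has rested on the same element long enough. The sketch below illustrates this idea in TypeScript for a web GUI. It is an illustration, not the authors' implementation: the GazeSample and GazeTracker types, the DWELL_MS threshold value, the data-affordance-hint attribute, and the attachGazeover/showHint/hideHint helpers are all hypothetical; only document.elementFromPoint and Element.closest are standard browser APIs.

// Minimal sketch of a gazeover trigger: show an affordance hint when the
// user's gaze dwells over an interactive element, analogous to mouseover.
// GazeSample and GazeTracker are hypothetical stand-ins for whatever
// eye-tracking SDK delivers gaze coordinates in viewport pixels.

interface GazeSample {
  x: number;         // gaze position in viewport pixels
  y: number;
  timestamp: number; // milliseconds
}

interface GazeTracker {
  onSample(handler: (sample: GazeSample) => void): void;
}

const DWELL_MS = 300; // assumed dwell threshold before the hint appears

function attachGazeover(tracker: GazeTracker): void {
  let currentTarget: Element | null = null;
  let dwellStart = 0;
  let hintShown = false;

  tracker.onSample((sample) => {
    // Hit-test the gaze point against the DOM, as mouseover hit-testing does
    // with the pointer position.
    const target = document.elementFromPoint(sample.x, sample.y);
    const interactive = target?.closest("[data-affordance-hint]") ?? null;

    if (interactive !== currentTarget) {
      // Gaze moved to a different element: hide any visible hint, reset dwell.
      if (hintShown && currentTarget) hideHint(currentTarget);
      currentTarget = interactive;
      dwellStart = sample.timestamp;
      hintShown = false;
    } else if (interactive && !hintShown &&
               sample.timestamp - dwellStart >= DWELL_MS) {
      // Gaze stayed on the same element long enough: reveal the affordance.
      showHint(interactive);
      hintShown = true;
    }
  });
}

// Placeholder hint renderers; a real UI would animate a tooltip or signifier.
function showHint(el: Element): void {
  el.setAttribute("data-hint-visible", "true");
}

function hideHint(el: Element): void {
  el.removeAttribute("data-hint-visible");
}

The dwell threshold is the central design parameter in such a sketch: gaze is far noisier than a mouse pointer, so triggering on the first sample over an element would make hints flicker, while too long a dwell would make the technique feel unresponsive.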





    Published In

    ICMI '18: Proceedings of the 20th ACM International Conference on Multimodal Interaction
    October 2018
    687 pages
    ISBN:9781450356923
    DOI:10.1145/3242969
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Sponsors

    • SIGCHI: Special Interest Group on Computer-Human Interaction of the ACM

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 02 October 2018


    Author Tags

    1. gaze
    2. user experience

    Qualifiers

    • Short-paper

    Conference

    ICMI '18
    Sponsor:
    • SIGCHI

    Acceptance Rates

    ICMI '18 Paper Acceptance Rate: 63 of 149 submissions, 42%
    Overall Acceptance Rate: 453 of 1,080 submissions, 42%


    Article Metrics

    • Downloads (last 12 months): 20
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 18 Feb 2025

    Cited By
    • (2021) Dependability and Safety: Two Clouds in the Blue Sky of Multimodal Interaction. Proceedings of the 2021 International Conference on Multimodal Interaction, 781-787. DOI: 10.1145/3462244.3479881. Online publication date: 18-Oct-2021.
    • (2020) “But Wait, There’s More!” A Deeper Look into Temporally Placing Touch Gesture Signifiers. Interactivity, Game Creation, Design, Learning, and Innovation, 290-308. DOI: 10.1007/978-3-030-53294-9_20. Online publication date: 28-Jul-2020.
    • (2019) Mouse, touch, or fich. Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, 1-7. DOI: 10.1145/3365610.3365622. Online publication date: 26-Nov-2019.
    • (2019) Creativity Support and Multimodal Pen-based Interaction. 2019 International Conference on Multimodal Interaction, 135-144. DOI: 10.1145/3340555.3353738. Online publication date: 14-Oct-2019.
