DOI: 10.1145/2207676.2208709

Look & touch: gaze-supported target acquisition

Published: 05 May 2012

Abstract

While eye tracking has high potential for fast selection tasks, it is often regarded as error-prone and unnatural, especially for gaze-only interaction. To improve on this, we propose gaze-supported interaction as a more natural and effective way of combining a user's gaze with touch input from a handheld device. In particular, we contribute a set of novel and practical gaze-supported selection techniques for distant displays. Designed according to the principle “gaze suggests, touch confirms,” they include an enhanced gaze-directed cursor, local zoom lenses, and more elaborate techniques utilizing manual fine positioning of the cursor via touch. In a comprehensive user study with 24 participants, we investigated the potential of these techniques for different target sizes and distances. All novel techniques outperformed a simple gaze-directed cursor and showed individual advantages. In particular, the techniques using touch for fine cursor adjustments (MAGIC touch) and for cycling through a list of possible close-to-gaze targets (MAGIC tab) demonstrated high overall performance and usability.
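To make the interaction principle concrete, below is a minimal sketch (in Python) of how a “gaze suggests, touch confirms” selector with MAGIC-tab-style candidate cycling could work. This is an illustration under stated assumptions, not the paper's implementation: the names (Target, MagicTabSelector, the on_* handlers) and the 60-pixel suggestion radius are hypothetical, and the paper's techniques additionally involve cursor visualization, zoom lenses, and touch-based fine positioning that are not modeled here.

```python
import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    x: float  # screen position in pixels
    y: float

def candidates_near_gaze(targets, gaze_x, gaze_y, radius):
    """'Gaze suggests': return targets within `radius` of the estimated
    gaze point, ordered nearest-first."""
    def dist(t):
        return math.hypot(t.x - gaze_x, t.y - gaze_y)
    return sorted((t for t in targets if dist(t) <= radius), key=dist)

class MagicTabSelector:
    """Sketch of MAGIC-tab-style selection: gaze proposes a candidate
    list, a 'tab' gesture on the handheld cycles the highlight, and an
    explicit touch commits it ('touch confirms')."""

    def __init__(self, targets, radius=60.0):  # radius is an assumed value
        self.targets = targets
        self.radius = radius
        self.candidates = []
        self.index = 0

    def on_gaze_sample(self, gaze_x, gaze_y):
        # Refresh the suggestion list from the latest (noisy) gaze estimate.
        self.candidates = candidates_near_gaze(
            self.targets, gaze_x, gaze_y, self.radius)
        self.index = 0  # the nearest candidate is highlighted first

    def on_tab(self):
        # Cycle the highlight through the close-to-gaze candidates.
        if self.candidates:
            self.index = (self.index + 1) % len(self.candidates)

    def on_touch_confirm(self):
        # Commit the currently highlighted candidate, if any.
        return self.candidates[self.index] if self.candidates else None
```

A hypothetical usage run, showing how cycling resolves a gaze landing between two small, adjacent targets:

```python
buttons = [Target("OK", 100, 100), Target("Cancel", 130, 105)]
selector = MagicTabSelector(buttons)
selector.on_gaze_sample(110, 102)        # gaze lands between both buttons
selector.on_tab()                        # cycle highlight to 2nd candidate
print(selector.on_touch_confirm().name)  # -> Cancel
```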

Supplementary Material

MP4 File (paperfile405-3.mp4)
Supplemental video for “Look & touch: gaze-supported target acquisition”




    Published In

    CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    May 2012
    3276 pages
    ISBN:9781450310154
    DOI:10.1145/2207676
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gaze input
    2. gaze-supported interaction
    3. mobile touch interaction
    4. selection
    5. target acquisition

    Qualifiers

    • Research-article

    Conference

    CHI '12

    Acceptance Rates

Overall Acceptance Rate: 6,199 of 26,314 submissions (24%)


    Article Metrics

• Downloads (last 12 months): 175
• Downloads (last 6 weeks): 29
    Reflects downloads up to 18 Nov 2024


    Cited By

• (2024) Eye-Hand Movement of Objects in Near Space Extended Reality. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-13. 10.1145/3654777.3676446. Online publication date: 13-Oct-2024
• (2024) Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-12. 10.1145/3654777.3676331. Online publication date: 13-Oct-2024
• (2024) Exploiting Physical Referent Features as Input for Multidimensional Data Selection in Augmented Reality. ACM Transactions on Computer-Human Interaction 31(4), 1-40. 10.1145/3648613. Online publication date: 19-Sep-2024
• (2024) MouseRing: Always-available Touchpad Interaction with IMU Rings. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-19. 10.1145/3613904.3642225. Online publication date: 11-May-2024
• (2024) Cone&Bubble: Evaluating Combinations of Gaze, Head and Hand Pointing for Target Selection in Dense 3D Environments. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 642-649. 10.1109/VRW62533.2024.00126. Online publication date: 16-Mar-2024
• (2024) BaggingHook: Selecting Moving Targets by Pruning Distractors Away for Intention-Prediction Heuristics in Dense 3D Environments. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 913-923. 10.1109/VR58804.2024.00110. Online publication date: 16-Mar-2024
• (2024) Eye-Hand Typing: Eye Gaze Assisted Finger Typing via Bayesian Processes in AR. IEEE Transactions on Visualization and Computer Graphics 30(5), 2496-2506. 10.1109/TVCG.2024.3372106. Online publication date: May-2024
• (2024) Designing a 3D gestural interface to support user interaction with time-oriented data as immersive 3D radar charts. Virtual Reality 28(1). 10.1007/s10055-023-00913-w. Online publication date: 23-Jan-2024
• (2024) MazeMind: Exploring the Effects of Hand Gestures and Eye Gazing on Cognitive Load and Task Efficiency in an Augmented Reality Environment. Design Computing and Cognition '24, 105-120. 10.1007/978-3-031-71922-6_7. Online publication date: 28-Sep-2024
• (2024) MASTER-XR: Mixed Reality Ecosystem for Teaching Robotics in Manufacturing. Integrated Systems: Data Driven Engineering, 167-182. 10.1007/978-3-031-53652-6_10. Online publication date: 17-Sep-2024
