Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets

Published: 08 September 2013
DOI: 10.1145/2493432.2493477

Abstract

Although gaze is an attractive modality for pervasive interaction, real-world implementation of eye-based interfaces poses significant challenges, such as calibration. We present Pursuits, an innovative interaction technique that enables truly spontaneous interaction with eye-based interfaces. A user can simply walk up to a screen and readily interact with moving targets. Instead of relying on gaze location, Pursuits correlates the user's smooth pursuit eye movements with objects moving dynamically on the interface. We evaluate the influence of target speed, number, and trajectory, and develop guidelines for designing Pursuits-based interfaces. We then describe six realistic usage scenarios and implement three of them to evaluate the method in a usability study and a field study. Our results show that Pursuits is a versatile and robust technique, and that users can interact with Pursuits-based interfaces without prior knowledge or a preparation phase.
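
The abstract's core idea, matching the eye's trajectory against each target's trajectory rather than mapping gaze to screen coordinates, can be made concrete with a small sketch. The Python fragment below is a minimal illustration, not the paper's implementation: it assumes time-aligned sliding windows of gaze samples and per-target positions, computes Pearson's product-moment correlation per axis, and reports a match only when both axes correlate above a threshold. The window contents, threshold value, and function names are hypothetical choices for illustration.

    import math

    def pearson(xs, ys):
        # Pearson product-moment correlation of two equal-length series.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        if sx == 0.0 or sy == 0.0:  # degenerate (motionless) window
            return 0.0
        return cov / (sx * sy)

    def match_pursuit(eye_window, target_windows, threshold=0.8):
        # eye_window:     list of (x, y) gaze samples over the last N frames
        # target_windows: one list of (x, y) positions per on-screen target,
        #                 sampled over the same N frames
        # Returns the index of the best-matching target, or None.
        ex = [p[0] for p in eye_window]
        ey = [p[1] for p in eye_window]
        best, best_corr = None, threshold
        for i, tw in enumerate(target_windows):
            cx = pearson(ex, [p[0] for p in tw])
            cy = pearson(ey, [p[1] for p in tw])
            corr = min(cx, cy)  # require both axes to correlate strongly
            if corr > best_corr:
                best, best_corr = i, corr
        return best

Because the comparison uses correlation of relative motion rather than absolute gaze coordinates, no mapping between eye and screen coordinates (i.e., no calibration) is needed, which is what enables the walk-up-and-use interaction the abstract describes.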

    Published In

    UbiComp '13: Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing
    September 2013
    846 pages
    ISBN: 9781450317702
    DOI: 10.1145/2493432

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. correlation
    2. eye movement
    3. eye-based interfaces
    4. smooth pursuits
    5. spontaneous interaction

    Qualifiers

    • Research-article

    Conference

    UbiComp '13

    Acceptance Rates

    UbiComp '13 Paper Acceptance Rate: 92 of 394 submissions, 23%
    Overall Acceptance Rate: 764 of 2,912 submissions, 26%
