DOI: 10.1007/978-3-540-88322-7_4
Article

EyeMote – Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography

Published: 20 October 2008

Abstract

Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as musical instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable and lightweight system which allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with performance equal to that of a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games that exploit unconscious eye movements and discuss the possibilities this new input modality may open up.
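
Although the abstract does not spell out the recognition pipeline, EOG-based gesture recognition of this kind typically works by denoising the horizontal and vertical EOG channels, detecting saccades as step-like signal changes, and matching the resulting direction sequence against gesture templates. The Python sketch below illustrates that idea; it is a minimal illustration under stated assumptions, not the authors' algorithm, and the sampling rate, threshold, and gesture templates are assumed values.

```python
import numpy as np

# Illustrative parameters; the paper's actual values are not given here.
FS = 250             # sampling rate in Hz (assumed)
SACCADE_THRESH = 80  # derivative threshold in raw signal units (assumed)

def median_denoise(x, k=9):
    """Median-filter a channel to suppress short spike artefacts
    (e.g. from walking) while preserving step-shaped saccades."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([np.median(xp[i:i + k]) for i in range(len(x))])

def detect_saccades(h, v):
    """Threshold the first derivative of the horizontal (h) and vertical
    (v) EOG channels and emit one direction symbol (L/R/U/D) per saccade."""
    dh, dv = np.diff(h), np.diff(v)
    symbols, i = [], 0
    while i < len(dh):
        if max(abs(dh[i]), abs(dv[i])) > SACCADE_THRESH:
            if abs(dh[i]) >= abs(dv[i]):
                symbols.append("R" if dh[i] > 0 else "L")
            else:
                symbols.append("U" if dv[i] > 0 else "D")
            i += int(0.1 * FS)  # refractory window: skip the rest of this saccade
        else:
            i += 1
    return "".join(symbols)

# Hypothetical templates mapping direction strings to named gestures.
TEMPLATES = {"RDLU": "square", "RL": "horizontal shake"}

def recognise(h, v):
    """Denoise both channels, extract the saccade string and look it
    up in the template table; returns None if no gesture matches."""
    sequence = detect_saccades(median_denoise(h), median_denoise(v))
    return TEMPLATES.get(sequence)
```

A median filter suits the pervasive scenario because walking artefacts appear as short spikes superimposed on the step-shaped saccade signal; published EOG systems often use adaptive or more elaborate filtering instead.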


Cited By

  • (2018) Error-aware gaze-based interfaces for robust mobile gaze interaction. In: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, pp. 1-10. DOI: 10.1145/3204493.3204536 (online 14 June 2018)
  • (2016) Possibilities and challenges with eye tracking in video games and virtual reality applications. In: SIGGRAPH ASIA 2016 Courses, pp. 1-150. DOI: 10.1145/2988458.2988466 (online 28 November 2016)
  • (2014) Dynamically adapting an AI game engine based on players' eye movements and strategies. In: Proceedings of the 2014 ACM SIGCHI Symposium on Engineering Interactive Computing Systems, pp. 3-12. DOI: 10.1145/2607023.2607029 (online 17 June 2014)
  • (2010) Gazing at games. In: ACM SIGGRAPH 2010 Courses, pp. 1-160. DOI: 10.1145/1837101.1837106 (online 26 July 2010)

    Published In

    Proceedings of the 2nd International Conference on Fun and Games
    October 2008
    202 pages
    ISBN: 978-3-540-88321-0
    Editors: Panos Markopoulos, Boris de Ruyter, Wijnand IJsselsteijn, Duncan Rowland

    Publisher

    Springer-Verlag

    Berlin, Heidelberg

    Publication History

    Published: 20 October 2008

    Qualifiers

    • Article



    Article Metrics

    • Downloads (last 12 months): 0
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 19 Nov 2024

