DOI: 10.1145/3588015.3588423

One step closer to EEG based eye tracking

Published: 30 May 2023

Abstract

In this paper, we present two approaches and algorithms that adapt areas of interest. We present a new deep neural network (DNN) that directly determines gaze position from EEG data. EEG-based eye tracking is a new and challenging research topic in the field of eye tracking, but it offers an alternative to image-based eye tracking with an input data set comparable to that of conventional image processing. The presented DNN exploits spatial dependencies in the EEG signal and uses convolutions similar to the spatial filtering applied when preprocessing EEG signals. With this, we improve direct gaze estimation from the EEG signal over the state of the art by 3.5 cm MAE (mean absolute error), but we do not yet achieve a directly applicable system, since the inaccuracy remains significantly higher than that of image-based eye trackers.
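
To make the abstract's central idea concrete, the following is a minimal PyTorch sketch of a network that applies a convolution spanning all EEG electrodes, loosely analogous to the spatial filtering used when preprocessing EEG signals, and regresses a 2D gaze position from the result. This is not the authors' published architecture: the electrode count, window length, layer sizes, and loss choice are illustrative assumptions only.

import torch
import torch.nn as nn

class EEGGazeRegressor(nn.Module):
    # Hedged sketch with hypothetical sizes (128 electrodes, 500-sample windows).
    def __init__(self, n_electrodes: int = 128, n_samples: int = 500):
        super().__init__()
        # Temporal convolution: learns frequency-like filters per electrode.
        self.temporal = nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12))
        # "Spatial" convolution spanning all electrodes at once, resembling
        # the spatial filters used in EEG preprocessing.
        self.spatial = nn.Conv2d(16, 32, kernel_size=(n_electrodes, 1))
        self.pool = nn.AvgPool2d(kernel_size=(1, 4))
        self.head = nn.Linear(32 * (n_samples // 4), 2)  # predict (x, y) gaze

    def forward(self, x):
        # x: (batch, 1, n_electrodes, n_samples) EEG window
        x = torch.relu(self.temporal(x))   # (batch, 16, n_electrodes, n_samples)
        x = torch.relu(self.spatial(x))    # (batch, 32, 1, n_samples)
        x = self.pool(x)                   # (batch, 32, 1, n_samples // 4)
        return self.head(x.flatten(1))     # (batch, 2)

# Usage with random tensors standing in for EEG windows; training such a model
# would minimize an L1 loss, which matches the MAE metric quoted in the abstract.
model = EEGGazeRegressor()
eeg = torch.randn(8, 1, 128, 500)
gaze_xy = model(eeg)  # shape (8, 2), e.g. screen coordinates in cm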

Supplemental Material

MP4 file: presentation video (short version)



Published In

ETRA '23: Proceedings of the 2023 Symposium on Eye Tracking Research and Applications
May 2023
441 pages
ISBN:9798400701504
DOI:10.1145/3588015
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. EEG
  2. EEG to gaze
  3. deep learning
  4. eye tracking
  5. gaze estimation
  6. machine learning

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ETRA '23

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%
