Research article · ETRA Conference Proceedings
DOI: 10.1145/2857491.2857532

EyeSee3D 2.0: model-based real-time analysis of mobile eye-tracking in static and dynamic three-dimensional scenes

Published: 14 March 2016

Abstract

With the launch of ultra-portable systems, mobile eye tracking finally has the potential to become mainstream. While eye movements on their own can already be used to identify human activities, such as reading or walking, linking eye movements to objects in the environment provides even deeper insights into human cognitive processing.
We present a model-based approach for identifying fixated objects in three-dimensional environments. For evaluation, we compare the automatic labelling of fixations with that of human annotators. In addition, we show how the approach can be extended to support moving targets, such as individual limbs or the faces of human interaction partners. The approach also scales to studies using multiple mobile eye-tracking systems in parallel.
The developed system supports real-time attentive systems that use eye tracking as a means for direct or indirect human-computer interaction, as well as offline analysis for basic research and usability studies.
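The core idea of such model-based fixation labelling can be illustrated as casting the measured gaze ray into a registered 3D scene model and reporting the nearest object it hits. The sketch below is a minimal illustration of that general principle under simplifying assumptions, not the authors' implementation: the scene, object names, and bounding-sphere approximation are invented for the example (real systems would intersect against full meshes and tracked poses).

```python
import math

# Hypothetical scene model: each object is approximated by a bounding
# sphere (center, radius) in a common world coordinate frame.
SCENE = {
    "mug":  ((0.0, 0.0, 2.0), 0.15),
    "book": ((0.5, 0.1, 2.5), 0.20),
}

def intersect_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the sphere, or None if missed."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant; a = 1 for a unit ray
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two intersections
    return t if t > 0.0 else None

def label_fixation(origin, direction, scene=SCENE):
    """Label a fixation with the nearest scene object hit by the gaze ray."""
    best, best_t = None, float("inf")
    for name, (center, radius) in scene.items():
        t = intersect_sphere(origin, direction, center, radius)
        if t is not None and t < best_t:
            best, best_t = name, t
    return best
```

For a gaze ray from the eye at the origin looking straight ahead, `label_fixation((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))` returns `"mug"`; a ray that hits nothing returns `None`, which corresponds to an unlabelled fixation.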




Published In

ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
March 2016
378 pages
ISBN:9781450341257
DOI:10.1145/2857491

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. 3D
  2. augmented reality
  3. eye tracking
  4. gaze analysis
  5. joint attention
  6. mobile eye tracking
  7. social interaction


Funding Sources

  • German Research Foundation (DFG)
  • BMBF

Conference

ETRA '16: 2016 Symposium on Eye Tracking Research and Applications
March 14-17, 2016
Charleston, South Carolina, United States

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%



Cited By

  • (2025) The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behavior Research Methods 57:1. DOI: 10.3758/s13428-024-02529-7. Online publication date: 6-Jan-2025.
  • (2023) Interactive Fixation-to-AOI Mapping for Mobile Eye Tracking Data based on Few-Shot Image Classification. Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 175-178. DOI: 10.1145/3581754.3584179. Online publication date: 27-Mar-2023.
  • (2023) IMETA: An Interactive Mobile Eye Tracking Annotation Method for Semi-automatic Fixation-to-AOI mapping. Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, 33-36. DOI: 10.1145/3581754.3584125. Online publication date: 27-Mar-2023.
  • (2022) A Systematic Review of Visualization Techniques and Analysis Tools for Eye-Tracking in 3D Environments. Frontiers in Neuroergonomics 3. DOI: 10.3389/fnrgo.2022.910019. Online publication date: 13-Jul-2022.
  • (2022) Using Fiducial Marker for Analyzing Wearable Eye-Tracker Gaze Data Measured While Cooking. HCI International 2022 - Late Breaking Papers: Multimodality in Advanced Interaction Environments, 192-204. DOI: 10.1007/978-3-031-17618-0_15. Online publication date: 2-Oct-2022.
  • (2021) Automatic Visual Attention Detection for Mobile Eye Tracking Using Pre-Trained Computer Vision Models and Human Gaze. Sensors 21:12 (4143). DOI: 10.3390/s21124143. Online publication date: 16-Jun-2021.
  • (2021) Image-Based Projection Labeling for Mobile Eye Tracking. ACM Symposium on Eye Tracking Research and Applications, 1-12. DOI: 10.1145/3448017.3457382. Online publication date: 25-May-2021.
  • (2021) Neural Networks for Semantic Gaze Analysis in XR Settings. ACM Symposium on Eye Tracking Research and Applications, 1-11. DOI: 10.1145/3448017.3457380. Online publication date: 25-May-2021.
  • (2021) 4D Attention: Comprehensive Framework for Spatio-Temporal Gaze Mapping. IEEE Robotics and Automation Letters 6:4, 7240-7247. DOI: 10.1109/LRA.2021.3097274. Online publication date: Oct-2021.
  • (2021) Inferring Goals with Gaze during Teleoperated Manipulation. 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 7307-7314. DOI: 10.1109/IROS51168.2021.9636551. Online publication date: 27-Sep-2021.
