DOI: 10.1145/3266037.3266119 · UIST '18 Adjunct · Poster

Towards a Symbiotic Human-Machine Depth Sensor: Exploring 3D Gaze for Object Reconstruction

Published: 11 October 2018

Abstract

Eye tracking is expected to become an integral part of future augmented reality (AR) head-mounted displays (HMDs), given that it can easily be integrated into existing hardware and provides a versatile interaction modality. To augment objects in the real world, AR HMDs require a three-dimensional understanding of the scene, which is currently obtained using depth cameras. In this work, we explore how 3D gaze data can enhance scene understanding for AR HMDs by envisioning a symbiotic human-machine depth camera that fuses depth data with 3D gaze information. We present a first proof of concept, exploring to what extent we can recognise what a user is looking at by plotting 3D gaze data. To measure 3D gaze, we implemented a vergence-based algorithm and built an eye tracking setup consisting of a Pupil Labs headset and an OptiTrack motion capture system, allowing us to measure 3D gaze inside a 50 x 50 x 50 cm volume. We show first 3D gaze plots of "gazed-at" objects and describe our vision of a symbiotic human-machine depth camera that combines a depth camera and human 3D gaze information.
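The abstract does not detail the vergence-based algorithm. A common way to realise vergence-based 3D gaze estimation is to model one gaze ray per eye (origin at the eye centre, direction from the tracker, both expressed in a shared world frame, e.g. via the motion capture system) and take the midpoint of the shortest segment connecting the two rays. The sketch below illustrates that standard triangulation; it is not the authors' implementation, and the function name and ray representation are assumptions.

```python
import numpy as np

def vergence_gaze_point(o_l, d_l, o_r, d_r):
    """Estimate a 3D gaze point from two gaze rays (one per eye).

    o_l, o_r: ray origins (eye centres), 3-vectors in a common world frame.
    d_l, d_r: gaze directions (need not be unit length).
    Returns the midpoint of the shortest segment between the two rays,
    or None when the rays are (near-)parallel and vergence is undefined.
    """
    d_l = np.asarray(d_l, float) / np.linalg.norm(d_l)
    d_r = np.asarray(d_r, float) / np.linalg.norm(d_r)
    w = np.asarray(o_l, float) - np.asarray(o_r, float)

    b = d_l @ d_r            # cosine of the vergence angle
    denom = 1.0 - b * b      # directions are unit vectors
    if denom < 1e-12:
        return None          # parallel gaze: no finite vergence point

    d = d_l @ w
    e = d_r @ w
    t_l = (b * e - d) / denom    # parameter along the left-eye ray
    t_r = (e - b * d) / denom    # parameter along the right-eye ray

    p_l = np.asarray(o_l, float) + t_l * d_l
    p_r = np.asarray(o_r, float) + t_r * d_r
    return 0.5 * (p_l + p_r)
```

In a setup like the one described, the motion capture system would supply the head pose needed to transform per-eye gaze directions from the eye tracker's coordinate frame into the world frame before triangulating.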


Cited By

  • (2020) A Design Space for External Communication of Autonomous Vehicles. 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 212-222. DOI: 10.1145/3409120.3410646. Online publication date: 21-Sep-2020.
  • (2020) Towards a Design Space for External Communication of Autonomous Vehicles. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, 1-8. DOI: 10.1145/3334480.3382844. Online publication date: 25-Apr-2020.
  • (2020) Eye-tracking for human-centered mixed reality: promises and challenges. Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR), 27. DOI: 10.1117/12.2542699. Online publication date: 13-Mar-2020.

Published In

UIST '18 Adjunct: Adjunct Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology
October 2018, 251 pages
ISBN: 9781450359498
DOI: 10.1145/3266037

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. 3d gaze
    2. eye-based interaction
    3. human-machine symbiosis

    Qualifiers

    • Poster

    Conference

    UIST '18

Acceptance Rates

UIST '18 Adjunct paper acceptance rate: 80 of 375 submissions (21%)
Overall acceptance rate: 355 of 1,733 submissions (20%)


