DOI: 10.1145/2733373.2806265

An Affordable Solution for Binocular Eye Tracking and Calibration in Head-mounted Displays

Published: 13 October 2015

Abstract

Immersion is the ultimate goal of head-mounted displays (HMDs) for Virtual Reality (VR) in order to produce a convincing user experience. Two important aspects in this context are motion sickness, often due to imprecise calibration, and the integration of reliable eye tracking. We propose an affordable hardware and software solution for drift-free eye tracking and user-friendly lens calibration within an HMD. The use of dichroic mirrors leads to a lean design that provides the full field of view (FOV) while using commodity cameras for eye tracking. Our prototype supports personalized lens positioning to accommodate different interocular distances. On the software side, a model-based calibration procedure adjusts the eye-tracking system and gaze estimation to varying lens positions. Challenges such as partial occlusions due to the lens holders and eyelids are handled by a novel, robust monocular pupil-tracking approach. We present four applications of our work: gaze map estimation, foveated rendering for depth of field, gaze-contingent level of detail, and gaze control of virtual avatars.
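The paper's pupil tracker is a robust model-based approach whose details are in the full text. As a rough, hypothetical illustration of the underlying dark-pupil principle only (locating the pupil as the darkest blob in a near-eye camera image), here is a minimal centroid sketch; the function `pupil_centroid`, its threshold, and the synthetic image are illustrative assumptions, not the authors' algorithm:

```python
# Illustrative dark-pupil centroid sketch (NOT the paper's method, which is a
# model-based ellipse fit robust to occlusions by lens holders and eyelids).

def pupil_centroid(image, threshold=50):
    """Estimate the pupil centre as the centroid of dark pixels.

    `image` is a 2-D list of grey values (0 = black, 255 = white).
    Returns (row, col) or None if no pixel falls below `threshold`.
    """
    count = row_sum = col_sum = 0
    for r, line in enumerate(image):
        for c, value in enumerate(line):
            if value < threshold:  # dark-pupil assumption: the pupil is
                count += 1         # the darkest region in the eye image
                row_sum += r
                col_sum += c
    if count == 0:
        return None
    return row_sum / count, col_sum / count

# Synthetic 20x20 "eye" image: bright background, dark 5x5 pupil
# spanning rows 8-12 and columns 10-14 (centre at row 10, column 12).
img = [[255] * 20 for _ in range(20)]
for r in range(8, 13):
    for c in range(10, 15):
        img[r][c] = 10

print(pupil_centroid(img))  # -> (10.0, 12.0)
```

A real tracker would instead fit an ellipse to the pupil contour (cf. the conic-fitting literature) so that partial occlusion shifts the boundary fit rather than the raw centroid; this sketch only conveys the dark-pupil idea.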



Published In
      MM '15: Proceedings of the 23rd ACM international conference on Multimedia
      October 2015
      1402 pages
      ISBN:9781450334594
      DOI:10.1145/2733373

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. eye tracking
      2. gaze
      3. head-mounted display
      4. mobile
      5. virtual reality
      6. wearable

      Qualifiers

      • Research-article

      Funding Sources

• European Union's Seventh Framework Programme (FP7/2007-2013), project Reality CG

      Conference

MM '15: ACM Multimedia Conference
      October 26 - 30, 2015
      Brisbane, Australia

      Acceptance Rates

      MM '15 Paper Acceptance Rate 56 of 252 submissions, 22%;
      Overall Acceptance Rate 2,145 of 8,556 submissions, 25%

Article Metrics

      • Downloads (Last 12 months)60
      • Downloads (Last 6 weeks)5
      Reflects downloads up to 27 Nov 2024

Cited By
• (2024) Improving Eye-Tracking Data Quality: A Framework for Reproducible Evaluation of Detection Algorithms. Sensors 24(9):2688, doi:10.3390/s24092688, 24 Apr 2024
• (2024) Real-Time Gaze Tracking via Head-Eye Cues on Head Mounted Devices. IEEE Transactions on Mobile Computing 23(12):13292-13309, doi:10.1109/TMC.2024.3425928, Dec 2024
• (2024) An Optical Design for Interaction With Mid-Air Images Using the Shape of Real Objects. IEEE Access 12:39129-39138, doi:10.1109/ACCESS.2024.3374782, 2024
• (2024) Individualized foveated rendering with eye-tracking head-mounted display. Virtual Reality 28(1), doi:10.1007/s10055-023-00931-8, 19 Jan 2024
• (2022) A Perspective Review on Integrating VR/AR with Haptics into STEM Education for Multi-Sensory Learning. Robotics 11(2):41, doi:10.3390/robotics11020041, 31 Mar 2022
• (2022) A Novel Integrated Eye-Tracking System with Stereo Stimuli for 3D Gaze Estimation. IEEE Transactions on Instrumentation and Measurement, doi:10.1109/TIM.2022.3225009, 2022
• (2022) A dataset of eye gaze images for calibration-free eye tracking augmented reality headset. Scientific Data 9(1), doi:10.1038/s41597-022-01200-0, 29 Mar 2022
• (2020) An Extended Method for Saccadic Eye Movement Measurements Using a Head-Mounted Display. Healthcare 8(2):104, doi:10.3390/healthcare8020104, 21 Apr 2020
• (2019) A Survey on 360° Video Streaming. ACM Computing Surveys 52(4):1-36, doi:10.1145/3329119, 30 Aug 2019
• (2019) Perceptual rasterization for head-mounted display image synthesis. ACM Transactions on Graphics 38(4):1-14, doi:10.1145/3306346.3323033, 12 Jul 2019
