
PuRe: Robust pupil detection for real-time pervasive eye tracking

Published: 01 May 2018

Highlights

A novel computer-vision-based algorithm for robust pupil detection is introduced.
The algorithm increases the detection rate by up to ten percentage points on challenging data sets.
Additional evaluation metrics for pupil detection algorithms are introduced.
Specificity, precision, and sensitivity improved by 5.96, 25.05, and 10.94 percentage points, respectively (the standard definitions of these metrics are recalled after this list).
The algorithm runs in real time for modern eye trackers (at 120 fps).
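
For reference, these metrics follow the standard confusion-matrix definitions (recalled here for convenience; the formulas below are standard usage, not reproduced from the paper), where a true positive is a frame whose pupil is correctly detected and a true negative is a pupil-less frame that is correctly rejected:

```latex
\mathrm{sensitivity} = \frac{TP}{TP + FN}, \qquad
\mathrm{specificity} = \frac{TN}{TN + FP}, \qquad
\mathrm{precision}   = \frac{TP}{TP + FP}
```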


Abstract

Real-time, accurate, and robust pupil detection is an essential prerequisite for pervasive eye tracking and its applications – e.g., gaze-based human-computer interaction, health monitoring, foveated rendering, and advanced driver assistance. However, automated pupil detection has proved to be an intricate task in real-world scenarios due to a wide range of challenges, such as quickly changing illumination and occlusions. In this paper, we introduce the Pupil Reconstructor (PuRe), a method for pupil detection in pervasive scenarios based on novel edge segment selection and conditional segment combination schemes; the method also includes a confidence measure for the detected pupil. The proposed method was evaluated on over 316,000 images acquired with four distinct head-mounted eye-tracking devices. Results show a pupil detection rate improvement of over 10 percentage points w.r.t. state-of-the-art algorithms on the two most challenging data sets (6.46 over all data sets), further pushing the envelope for pupil detection. Moreover, we advance the evaluation protocol of pupil detection algorithms by also considering eye images in which pupils are not present and by contributing a new data set of mostly closed-eye images. In this aspect, PuRe improved precision and specificity w.r.t. state-of-the-art algorithms by 25.05 and 10.94 percentage points, respectively, demonstrating the meaningfulness of PuRe’s confidence measure. PuRe operates in real time for modern eye trackers (at 120 fps) and is fully integrated into EyeRecToo – an open-source state-of-the-art software suite for pervasive head-mounted eye tracking. The proposed method and data set are available at http://www.ti.uni-tuebingen.de/perception.
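
Although this page does not reproduce the algorithm itself, the pipeline the abstract describes (edge extraction, edge segment selection, ellipse fitting, and a confidence measure that can reject pupil-less frames) can be sketched with standard computer-vision primitives. The sketch below is a minimal illustration under stated assumptions – OpenCV 4 for Python, illustrative thresholds, and a placeholder aspect-ratio confidence heuristic – and is not PuRe's actual selection and combination scheme:

```python
# Minimal sketch of an edge-based pupil detection pipeline in the spirit of the
# abstract's description. All thresholds and the confidence heuristic are
# illustrative assumptions, not the paper's method. Assumes OpenCV 4
# (pip install opencv-python) and a grayscale near-infrared eye image as input.
import cv2
import numpy as np

def detect_pupil(eye_gray, min_points=5, confidence_threshold=0.6):
    """Return ((center, axes, angle), confidence) or None if no pupil is found."""
    # 1. Smooth and extract edges (PuRe likewise starts from edge detection).
    blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 64, 128)

    # 2. Group edge pixels into connected segments (candidate pupil outlines).
    segments, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

    # 3. Fit an ellipse to each sufficiently long segment and score it.
    best, best_conf = None, 0.0
    for seg in segments:
        if len(seg) < min_points:        # cv2.fitEllipse needs >= 5 points
            continue
        ellipse = cv2.fitEllipse(seg)
        (_, _), (w, h), _ = ellipse
        if min(w, h) <= 0:
            continue
        # Placeholder confidence: near-circular ellipses score close to 1.
        # PuRe's actual confidence measure is richer than this aspect ratio.
        conf = min(w, h) / max(w, h)
        if conf > best_conf:
            best, best_conf = ellipse, conf

    # 4. A thresholded confidence lets the detector report "no pupil" on
    #    low-quality frames (e.g., blinks, closed eyes) instead of guessing.
    if best is not None and best_conf >= confidence_threshold:
        return best, best_conf
    return None

if __name__ == "__main__":
    # Synthetic smoke test: a dark disk standing in for the pupil.
    img = np.full((240, 320), 200, np.uint8)
    cv2.circle(img, (160, 120), 25, 40, -1)
    print(detect_pupil(img))  # an ellipse near (160, 120) with confidence ~1.0
```

Note that the abstract's conditional segment combination step – merging edge segments that jointly form a better pupil outline – is omitted here; it is precisely what distinguishes PuRe from a plain fit-the-best-segment baseline like this one.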




Published In

Computer Vision and Image Understanding, Volume 170, Issue C, May 2018, 109 pages

Publisher

Elsevier Science Inc., United States


Author Tags

1. Pupil detection
2. Pervasive
3. Eye tracking
4. Embedded

Qualifiers

• Research-article

Cited By

• (2024) Using Deep Learning to Increase Eye-Tracking Robustness, Accuracy, and Precision in Virtual Reality. Proceedings of the ACM on Computer Graphics and Interactive Techniques 7(2), 1–16. https://doi.org/10.1145/3654705
• (2024) Zero-Shot Segmentation of Eye Features Using the Segment Anything Model (SAM). Proceedings of the ACM on Computer Graphics and Interactive Techniques 7(2), 1–16. https://doi.org/10.1145/3654704
• (2024) CSA-CNN: A Contrastive Self-Attention Neural Network for Pupil Segmentation in Eye Gaze Tracking. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3649902.3653351
• (2024) Eye detection and coarse localization of pupil for video-based eye tracking systems. Expert Systems with Applications 236(C). https://doi.org/10.1016/j.eswa.2023.121316
• (2024) Pistol: Pupil Invisible Supportive Tool in the Wild. SN Computer Science 5(3). https://doi.org/10.1007/s42979-024-02606-w
• (2023) Automatic Assessment of Depression and Anxiety through Encoding Pupil-wave from HCI in VR Scenes. ACM Transactions on Multimedia Computing, Communications, and Applications 20(2), 1–22. https://doi.org/10.1145/3513263
• (2023) Pupil centre’s localization with transformer without real pupil. Multimedia Tools and Applications 82(16), 25467–25484. https://doi.org/10.1007/s11042-023-14403-3
• (2022) IoT-Enabled Environment Illuminance Optimization for Augmented Reality. Adjunct Proceedings of the 2022 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2022 ACM International Symposium on Wearable Computers, 112–114. https://doi.org/10.1145/3544793.3560357
• (2022) A Highly Integrated Ambient Light Robust Eye-Tracking Sensor for Retinal Projection AR Glasses Based on Laser Feedback Interferometry. Proceedings of the ACM on Human-Computer Interaction 6(ETRA), 1–18. https://doi.org/10.1145/3530881
• (2022) Pupil center detection inspired by multi-task auxiliary learning characteristic. Multimedia Tools and Applications 81(28), 40067–40088. https://doi.org/10.1007/s11042-022-12278-4