Abstract
Complex and hazardous driving situations often arise when traffic objects are perceived too late. To automatically detect whether such objects have been perceived by the driver, techniques are needed that can reliably recognize whether the driver’s eyes have fixated on, or are pursuing, the hazardous object. A prerequisite for such techniques is the reliable recognition of fixations, saccades, and smooth pursuits from raw eye-tracking data. This chapter addresses the challenge of analyzing the driver’s visual behavior in an adaptive and online fashion to automatically distinguish between fixation clusters, saccades, and smooth pursuits.
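To make the classification task concrete, the sketch below shows a minimal online eye-movement classifier. It is not the probabilistic method developed in this chapter; it only illustrates the setting: each incoming gaze sample (timestamp plus gaze position in degrees of visual angle) is labelled as part of a fixation, a saccade, or a smooth pursuit from its sample-to-sample velocity. The velocity thresholds and the classify_stream helper are hypothetical and would need tuning to the eye tracker’s sampling rate and noise level.

```python
# Illustrative velocity-threshold classifier (NOT the chapter's method).
# Assumes gaze positions in degrees of visual angle and timestamps in seconds.
from math import hypot

SACCADE_DEG_PER_S = 300.0   # hypothetical velocity threshold for saccades
PURSUIT_DEG_PER_S = 30.0    # hypothetical lower bound for smooth pursuit

def classify_stream(samples):
    """samples: iterable of (t, x, y); yields (t, label) one sample at a time."""
    prev = None
    for t, x, y in samples:
        if prev is None:
            label = "fixation"                    # no velocity estimate yet
        else:
            t0, x0, y0 = prev
            dt = t - t0
            speed = hypot(x - x0, y - y0) / dt if dt > 0 else 0.0
            if speed >= SACCADE_DEG_PER_S:
                label = "saccade"
            elif speed >= PURSUIT_DEG_PER_S:
                label = "smooth_pursuit"
            else:
                label = "fixation"
        prev = (t, x, y)
        yield t, label

# Example: a synthetic 250 Hz stream drifting at ~50 deg/s (pursuit-like).
stream = [(i / 250.0, 0.2 * i, 0.0) for i in range(10)]
for t, label in classify_stream(stream):
    print(f"{t:.3f} s -> {label}")
```

Unlike such fixed thresholds, the approach described in the chapter works adaptively and online, which matters when noise characteristics and viewing behavior change during driving.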
Copyright information
© 2015 Springer International Publishing Switzerland
About this paper
Cite this paper
Kasneci, E., Kasneci, G., Kübler, T.C., Rosenstiel, W. (2015). Online Recognition of Fixations, Saccades, and Smooth Pursuits for Automated Analysis of Traffic Hazard Perception. In: Koprinkova-Hristova, P., Mladenov, V., Kasabov, N. (eds) Artificial Neural Networks. Springer Series in Bio-/Neuroinformatics, vol 4. Springer, Cham. https://doi.org/10.1007/978-3-319-09903-3_20
DOI: https://doi.org/10.1007/978-3-319-09903-3_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-09902-6
Online ISBN: 978-3-319-09903-3