
Driver’s eye blinking detection using novel color and texture segmentation algorithms

  • Regular Paper
  • Control Applications

International Journal of Control, Automation and Systems

Abstract

In this paper, we propose a system that measures eye blinking rate and eye closure duration. The system consists of skin-color segmentation, facial feature segmentation, iris positioning, and blink detection. The proposed skin-segmentation procedure is based on a neural network approximation of an RGB skin-color histogram. This method is robust and adaptive to any skin-color training set. The largest skin-color region obtained from the segmentation is further segmented into open/closed eyes, lips, nose, eyebrows, and the remaining facial regions using a novel texture segmentation algorithm. The segmentation algorithm classifies pixels according to the highest probability among the estimated facial feature class probability density functions (PDFs). The segmented eye regions are analyzed with the circular Hough transform to find iris candidates. The final iris position is selected according to the location of the maximum correlation value obtained by correlating the candidates with a predefined mask. The positions of the irises and the eye states are monitored through time to estimate eye blinking frequency and eye closure duration. A method for detecting driver drowsiness from these parameters is illustrated. The proposed system is tested with CCD and CMOS cameras under different environmental conditions, and the experimental results show high system performance.
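
To make the pipeline in the abstract concrete, the sketch below (a minimal illustration, not the authors' implementation) shows how its three core steps could look in Python with OpenCV and NumPy: maximum-probability pixel labeling over estimated facial-feature class PDFs, circular-Hough iris-candidate detection refined by correlation with a predefined mask, and blink-frequency / eye-closure estimation from a per-frame sequence of eye states. Every function name, parameter value, and the iris mask itself is a hypothetical placeholder.

```python
# A minimal sketch (not the authors' code), assuming OpenCV and NumPy.
# Class-PDF callables, Hough parameters, and the iris mask are hypothetical
# placeholders standing in for the models described in the abstract.
import cv2
import numpy as np


def classify_pixels(features, class_pdfs):
    """Label each pixel with the facial-feature class whose estimated PDF
    assigns it the highest probability (maximum-likelihood labeling).

    features   : (H, W, D) array of per-pixel color/texture descriptors
    class_pdfs : list of K callables, each mapping an (N, D) array to (N,) densities
    """
    h, w, d = features.shape
    flat = features.reshape(-1, d)
    scores = np.stack([pdf(flat) for pdf in class_pdfs], axis=1)  # (N, K)
    return scores.argmax(axis=1).reshape(h, w)


def find_iris(eye_gray, iris_mask):
    """Find iris candidates with the circular Hough transform, then keep the
    candidate whose neighborhood correlates best with a predefined iris mask
    (assumed square with an odd side length)."""
    circles = cv2.HoughCircles(eye_gray, cv2.HOUGH_GRADIENT, dp=1, minDist=10,
                               param1=100, param2=15, minRadius=5, maxRadius=30)
    if circles is None:
        return None  # no candidates; treated as a closed eye in this sketch
    best, best_score = None, -np.inf
    r = iris_mask.shape[0] // 2
    for x, y, _ in np.round(circles[0]).astype(int):
        if x < r or y < r:
            continue  # candidate too close to the border for a full patch
        patch = eye_gray[y - r:y + r + 1, x - r:x + r + 1]
        if patch.shape != iris_mask.shape:
            continue
        score = cv2.matchTemplate(patch.astype(np.float32),
                                  iris_mask.astype(np.float32),
                                  cv2.TM_CCOEFF_NORMED)[0, 0]
        if score > best_score:
            best, best_score = (x, y), score
    return best


def blink_stats(eye_open_flags, fps):
    """Estimate blink frequency (blinks per minute) and a PERCLOS-like
    eye-closure ratio from a per-frame sequence of open/closed eye states."""
    flags = np.asarray(eye_open_flags, dtype=bool)
    blinks = np.count_nonzero(flags[:-1] & ~flags[1:])  # open -> closed transitions
    minutes = len(flags) / fps / 60.0
    closure_ratio = float(np.mean(~flags))  # fraction of frames with eyes closed
    return blinks / minutes, closure_ratio
```

In the paper's setting, the two statistics returned by blink_stats would feed a drowsiness criterion, for example a threshold on the PERCLOS-style closure ratio; the specific rule used by the authors is described in the full article.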



Author information

Corresponding author

Correspondence to Jong-Soo Lee.

Additional information

Recommended by Editorial Board member Dong-Joong Kang under the direction of Editor Young-Hoon Joo.

This work was supported by the 2006 Research Fund of University of Ulsan.

Artem A. Lenskiy received his Master’s degree in Digital Signal Processing and Data Analysis in 2004 from Novosibirsk State Technical University, Russia, and a Ph.D. degree in electrical engineering from the University of Ulsan, Korea, in 2010. He is currently lecturing at Korea University of Technology and Education (Koreatech), Korea. His research interests include computer vision problems and various applications of processes with long-range dependence.

Jong-Soo Lee received his Bachelor’s degree in Electrical Engineering in 1973 from Seoul National University and his M.S. degree in 1981. In 1985, he was awarded his Ph.D. from Virginia Polytechnic Institute and State University, Blacksburg, USA. He is currently working in the area of multimedia at the University of Ulsan, Korea. His research interests include the development of personal English cultural experience programs using multimedia and usability interface techniques to facilitate the acquisition of English language skills by Koreans. He is also working on vocal tract modeling from speech data based on fluid dynamics.


About this article

Cite this article

Lenskiy, A.A., Lee, JS. Driver’s eye blinking detection using novel color and texture segmentation algorithms. Int. J. Control Autom. Syst. 10, 317–327 (2012). https://doi.org/10.1007/s12555-012-0212-0
