Abstract
Eye gaze tracking plays an important role in various fields, including human-computer interaction, virtual and augmented reality, and the affective evaluation of marketing content. This paper addresses the problem of real-time eye gaze estimation using the low-resolution ordinary camera available in almost every desktop environment, as opposed to gaze tracking technologies that require costly equipment and infrared light sources. A camera-based, non-invasive technique is proposed for tracking and recording gaze points. The proposed framework was then used to analyze the gaze behavior of users viewing advertisements displayed on a social media website. Eye gaze fixation data from 32 participants were recorded, and gaze patterns were plotted as heat maps. In addition, a gaze-driven interface was designed for virtual interaction tasks to assess the performance and usability of the proposed framework.
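To make the setup concrete, the following is a minimal illustrative sketch of a webcam-based pipeline: it localizes eye regions in frames from an ordinary camera and accumulates the detected eye centres into a heat-map image. It uses OpenCV's bundled Haar cascades purely as a stand-in; the paper's own framework is based on a convolutional neural network, and the camera index, detector parameters, and output filename below are assumptions made for illustration only.

```python
# Minimal sketch (not the paper's CNN pipeline): webcam eye localisation with
# OpenCV Haar cascades plus a simple heat-map of accumulated eye centres.
import cv2
import numpy as np

# Haar cascades ship with opencv-python; their use here is an assumption for
# illustration. The paper itself uses a convolutional neural network.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)   # ordinary low-resolution webcam (index assumed)
heat = None                 # accumulator for a simple attention heat map

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if heat is None:
        heat = np.zeros(gray.shape, np.float32)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            # Rough eye centre in image coordinates; a real gaze tracker would
            # map this point onto the screen after user calibration.
            cx, cy = x + ex + ew // 2, y + ey + eh // 2
            heat[cy, cx] += 1.0
    cv2.imshow("webcam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

# Blur and colour-map the accumulated counts to visualise attention.
if heat is not None:
    blurred = cv2.GaussianBlur(heat, (0, 0), sigmaX=15)
    norm = cv2.normalize(blurred, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    cv2.imwrite("gaze_heatmap.png", cv2.applyColorMap(norm, cv2.COLORMAP_JET))
```

In a full gaze-tracking system, the per-frame eye centres would be calibrated to screen coordinates and clustered into fixations before being rendered as a heat map over the viewed content.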
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Modi, N., Singh, J. Real-time camera-based eye gaze tracking using convolutional neural network: a case study on social media website. Virtual Reality 26, 1489–1506 (2022). https://doi.org/10.1007/s10055-022-00642-6