• Li J, Wang S, Wang G, Zhang J, Feng S, Xiao Y and Wu S. (2024). The effects of complex assembly task type and assembly experience on users’ demands for augmented reality instructions. The International Journal of Advanced Manufacturing Technology. 10.1007/s00170-024-13091-z. 131:3-4. (1479-1496). Online publication date: 1-Mar-2024.

    https://link.springer.com/10.1007/s00170-024-13091-z

  • Fuhl W, Weber D and Eivazi S. (2024). Pistol: Pupil Invisible Supportive Tool in the Wild. SN Computer Science. 10.1007/s42979-024-02606-w. 5:3.

    https://link.springer.com/10.1007/s42979-024-02606-w

  • Chang C, Hung J and Chang J. (2024). Exploring the Potential of Webcam-Based Eye-Tracking for Traditional Eye-Tracking Analysis. Frontier Computing on Industrial Applications Volume 4. 10.1007/978-981-99-9342-0_33. (313-316).

    https://link.springer.com/10.1007/978-981-99-9342-0_33

  • König J, Penaredondo J, McCullagh E, Bowen J and Hinze A. Let's Make it Accessible: The Challenges Of Working With Low-cost Commercially Available Wearable Devices. Proceedings of the 35th Australian Computer-Human Interaction Conference. (493-503).

    https://doi.org/10.1145/3638380.3638415

  • Jin K, Ren Y, Hou C, Liu X, Ye F, Hou M, Shi Z and Zhao J. Gaze distance detection by binocular tracking for automatic zoom glasses. Optical Engineering. 10.1117/1.OE.62.7.073101. 62:07.

    https://www.spiedigitallibrary.org/journals/optical-engineering/volume-62/issue-07/073101/Gaze-distance-detection-by-binocular-tracking-for-automatic-zoom-glasses/10.1117/1.OE.62.7.073101.full

  • Hosp B and Wahl S. ZING: An Eye-Tracking Experiment Software for Organization and Presentation of Omnidirectional Stimuli in Virtual Reality. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications. (1-4).

    https://doi.org/10.1145/3588015.3589201

  • Capdevila M, Rodrigues K, Jardim C and Silva R. (2023). An Open Source Eye Gaze Tracker System to Perform Remote User Testing Evaluations. Intelligent Systems. 10.1007/978-3-031-45392-2_13. (192-207).

    https://link.springer.com/10.1007/978-3-031-45392-2_13

  • Iannizzotto G, Nucita A and Lo Bello L. (2022). Improving the Reader’s Attention and Focus through an AI-Driven Interactive and User-Aware Virtual Assistant for Handheld Devices. Applied System Innovation. 10.3390/asi5050092. 5:5. (92).

    https://www.mdpi.com/2571-5577/5/5/92

  • Kumar A, Anand J and Hemanth Kumar B. (2023). Intrusive video oculographic device: An eye-gaze-based device for communication. Innovation and Emerging Technologies. 10.1142/S2737599422500025. 09. Online publication date: 1-Jan-2022.

    https://www.worldscientific.com/doi/10.1142/S2737599422500025

  • Ratsamee P, Mae Y, Kamiyama K, Horade M, Kojima M and Arai T. (2021). Object segmentation in cluttered environment based on gaze tracing and gaze blinking. ROBOMECH Journal. 10.1186/s40648-021-00214-4. 8:1. Online publication date: 1-Dec-2021.

    https://robomechjournal.springeropen.com/articles/10.1186/s40648-021-00214-4

  • Wang Y, Lu S and Harter D. Multi-Sensor Eye-Tracking Systems and Tools for Capturing Student Attention and Understanding Engagement in Learning: A Review. IEEE Sensors Journal. 10.1109/JSEN.2021.3105706. 21:20. (22402-22413).

    https://ieeexplore.ieee.org/document/9516012/

  • Masopust L, Bauer D, Yao S and Ma K. (2021). A Comparison of the Fatigue Progression of Eye-Tracked and Motion-Controlled Interaction in Immersive Space. 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 10.1109/ISMAR52148.2021.00063. 978-1-6654-0158-6. (460-469).

    https://ieeexplore.ieee.org/document/9583829/

  • Miron C, Grigoras L, Ciucu R and Manta V. (2022). Eye Image Segmentation Method Based on the Modified U-Net CNN Architecture. Bulletin of the Polytechnic Institute of Iași. Electrical Engineering, Power Engineering, Electronics Section. 10.2478/bipie-2021-0010. 67:2. (41-52). Online publication date: 1-Jun-2021.

    https://www.sciendo.com/article/10.2478/bipie-2021-0010

  • Tang N, Fan J, Wang P and Shi G. (2021). Microscope integrated optical coherence tomography system combined with augmented reality. Optics Express. 10.1364/OE.420375. 29:6. (9407). Online publication date: 15-Mar-2021.

    https://opg.optica.org/abstract.cfm?URI=oe-29-6-9407

  • Ou W, Kuo T, Chang C and Fan C. (2021). Deep-Learning-Based Pupil Center Detection and Tracking Technology for Visible-Light Wearable Gaze Tracking Devices. Applied Sciences. 10.3390/app11020851. 11:2. (851).

    https://www.mdpi.com/2076-3417/11/2/851

  • Sun X and Balasingam B. Reading Line Classification Using Eye-Trackers. IEEE Transactions on Instrumentation and Measurement. 10.1109/TIM.2021.3094817. 70. (1-10).

    https://ieeexplore.ieee.org/document/9475049/

  • Wang Y, Lu S and Harter D. Eye Tracking and Learning Analytics for Promoting Proactive Teaching and Learning in Classroom: A Survey. Proceedings of the 2020 4th International Conference on Education and E-Learning. (156-160).

    https://doi.org/10.1145/3439147.3439161

  • Nam H, Hernandez I and Harmon B. Unmasked. Adjunct Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. (111-113).

    https://doi.org/10.1145/3379350.3416137

  • Pryluk R, Shohat Y, Morozov A, Friedman D, Taub A and Paz R. (2020). Shared yet dissociable neural codes across eye gaze, valence and expectation. Nature. 10.1038/s41586-020-2740-8. 586:7827. (95-100). Online publication date: 1-Oct-2020.

    https://www.nature.com/articles/s41586-020-2740-8

  • Costi A, Belk M, Fidas C, Constantinides A and Pitsillides A. CogniKit. Companion Proceedings of the 25th International Conference on Intelligent User Interfaces. (130-131).

    https://doi.org/10.1145/3379336.3381460

  • Hooge I, Holleman G, Haukes N and Hessels R. (2018). Gaze tracking accuracy in humans: One eye is sometimes better than two. Behavior Research Methods. 10.3758/s13428-018-1135-3. 51:6. (2712-2721). Online publication date: 1-Dec-2019.

    http://link.springer.com/10.3758/s13428-018-1135-3

  • Singh J and Modi N. (2019). Use of information modelling techniques to understand research trends in eye gaze estimation methods: An automated review. Heliyon. 10.1016/j.heliyon.2019.e03033. 5:12. (e03033). Online publication date: 1-Dec-2019.

    https://linkinghub.elsevier.com/retrieve/pii/S2405844019366927

  • Meena Y, Cecotti H, Wong-Lin K and Prasad G. (2019). Design and evaluation of a time adaptive multimodal virtual keyboard. Journal on Multimodal User Interfaces. 10.1007/s12193-019-00293-z. 13:4. (343-361). Online publication date: 1-Dec-2019.

    http://link.springer.com/10.1007/s12193-019-00293-z

  • Zhang X, Yuan S and Chen Y. (2019). WEYE: An open-source software system for recording and analyzing of eye- and mouse- tracking data from webpages. 2019 IEEE 2nd International Conference on Knowledge Innovation and Invention (ICKII). 10.1109/ICKII46306.2019.9042756. 978-1-7281-0110-1. (233-236).

    https://ieeexplore.ieee.org/document/9042756/

  • Shakil A, Lutteroth C and Weber G. CodeGazer. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (1-12).

    https://doi.org/10.1145/3290605.3300306

  • Stuart S, Parrington L, Martini D, Popa B, Fino P and King L. (2019). Validation of a velocity-based algorithm to quantify saccades during walking and turning in mild traumatic brain injury and healthy controls. Physiological Measurement. 10.1088/1361-6579/ab159d. 40:4. (044006).

    https://iopscience.iop.org/article/10.1088/1361-6579/ab159d

  • Santamaria A, Raimondo P, Palmieri N, Tropea M and De Rango F. (2019). Cooperative Video-Surveillance Framework in Internet of Things (IoT) Domain. The Internet of Things for Smart Urban Ecosystems. 10.1007/978-3-319-96550-5_13. (305-331).

    http://link.springer.com/10.1007/978-3-319-96550-5_13

  • Ooms K and Krassanakis V. (2018). Measuring the Spatial Noise of a Low-Cost Eye Tracker to Enhance Fixation Detection. Journal of Imaging. 10.3390/jimaging4080096. 4:8. (96).

    http://www.mdpi.com/2313-433X/4/8/96

  • Gavas R, Roy S, Chatterjee D, Tripathy S, Chakravarty K, Sinha A and Kasneci E. (2018). Enhancing the usability of low-cost eye trackers for rehabilitation applications. PLOS ONE. 10.1371/journal.pone.0196348. 13:6. (e0196348).

    https://dx.plos.org/10.1371/journal.pone.0196348

  • OGAWA T, NAKAZAWA A and NISHIDA T. (2018). Point of Gaze Estimation Using Corneal Surface Reflection and Omnidirectional Camera Image. IEICE Transactions on Information and Systems. 10.1587/transinf.2017MVP0020. E101.D:5. (1278-1287). Online publication date: 1-May-2018.

    https://www.jstage.jst.go.jp/article/transinf/E101.D/5/E101.D_2017MVP0020/_article

  • Mompeán J, Aragón J, Prieto P and Artal P. (2018). Design of an accurate and high-speed binocular pupil tracking system based on GPGPUs. The Journal of Supercomputing. 74:5. (1836-1862). Online publication date: 1-May-2018.

    https://doi.org/10.1007/s11227-017-2193-5

  • Lander C, Speicher M, Kerber F and Krüger A. Towards Fixation Extraction in Corneal Imaging Based Eye Tracking Data. Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. (1-6).

    https://doi.org/10.1145/3170427.3188597

  • Pfeiffer T. Gaze-Based Assistive Technologies. Smart Technologies. 10.4018/978-1-5225-2589-9.ch003. (44-66).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-5225-2589-9.ch003

  • Patil S, Jha R and Nigam A. (2017). IPSegNet: Deep Convolutional Neural Network Based Segmentation Framework for Iris and Pupil. 2017 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS). 10.1109/SITIS.2017.40. 978-1-5386-4283-2. (184-191).

    http://ieeexplore.ieee.org/document/8334745/

  • Ogai S and Tanaka T. (2017). A drag-and-drop type human computer interaction technique based on electrooculogram. 2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC). 10.1109/APSIPA.2017.8282126. 978-1-5386-1542-3. (716-720).

    http://ieeexplore.ieee.org/document/8282126/

  • Li T, Liu Q and Zhou X. Ultra-Low Power Gaze Tracking for Virtual Reality. Proceedings of the 15th ACM Conference on Embedded Network Sensor Systems. (1-14).

    https://doi.org/10.1145/3131672.3131682

  • Samara A, Galway L, Bond R and Wang H. (2017). Tracking and evaluation of pupil dilation via facial point marker analysis. 2017 IEEE International Conference on Bioinformatics and Biomedicine (BIBM). 10.1109/BIBM.2017.8217974. 978-1-5090-3050-7. (2037-2043).

    http://ieeexplore.ieee.org/document/8217974/

  • Takemura K and Yamagishi K. A hybrid eye-tracking method using a multispectral camera. 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC). (1529-1534).

    https://doi.org/10.1109/SMC.2017.8122831

  • Gavas R, Chatterjee D and Sinha A. Estimation of cognitive load based on the pupil size dilation. 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC). (1499-1504).

    https://doi.org/10.1109/SMC.2017.8122826

  • Othman M, Amaral T, McNaney R, Smeddinck J, Vines J and Olivier P. CrowdEyes. Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services. (1-13).

    https://doi.org/10.1145/3098279.3098559

  • Sogo H. (2016). Sgttoolbox: Utility for controlling SimpleGazeTracker from Psychtoolbox. Behavior Research Methods. 10.3758/s13428-016-0791-4. 49:4. (1323-1332). Online publication date: 1-Aug-2017.

    http://link.springer.com/10.3758/s13428-016-0791-4

  • Wu J, Ou W and Fan C. (2017). NIR-based gaze tracking with fast pupil ellipse fitting for real-time wearable eye trackers. 2017 IEEE Conference on Dependable and Secure Computing. 10.1109/DESEC.2017.8073839. 978-1-5090-5569-2. (93-97).

    http://ieeexplore.ieee.org/document/8073839/

  • Liu C, Herrup K and Shi B. (2017). Remote gaze tracking system for 3D environments. 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 10.1109/EMBC.2017.8037186. 978-1-5090-2809-2. (1768-1771).

    https://ieeexplore.ieee.org/document/8037186/

  • Suefusa K and Tanaka T. (2017). A comparison study of visually stimulated brain–computer and eye-tracking interfaces. Journal of Neural Engineering. 10.1088/1741-2552/aa6086. 14:3. (036009). Online publication date: 1-Jun-2017.

    https://iopscience.iop.org/article/10.1088/1741-2552/aa6086

  • Santini T, Fuhl W and Kasneci E. CalibMe. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. (2594-2605).

    https://doi.org/10.1145/3025453.3025950

  • Nakazawa A, Kato H, Nitschke C and Nishida T. (2017). Eye gaze tracking using corneal imaging and active illumination devices. Advanced Robotics. 10.1080/01691864.2016.1277552. 31:8. (413-427). Online publication date: 18-Apr-2017.

    https://www.tandfonline.com/doi/full/10.1080/01691864.2016.1277552

  • Dementyev A and Holz C. (2017). DualBlink. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 1:1. (1-19). Online publication date: 30-Mar-2017.

    https://doi.org/10.1145/3053330

  • Boldu R, Zhang H, Cortés J, Muthukumarana S and Nanayakkara S. InSight. Proceedings of the 8th Augmented Human International Conference. (1-5).

    https://doi.org/10.1145/3041164.3041195

  • Kar A and Corcoran P. A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms. IEEE Access. 10.1109/ACCESS.2017.2735633. 5. (16495-16519).

    http://ieeexplore.ieee.org/document/8003267/

  • Evans D and Fendley M. (2017). A multi-measure approach for connecting cognitive workload and automation. International Journal of Human-Computer Studies. 10.1016/j.ijhcs.2016.05.008. 97. (182-189). Online publication date: 1-Jan-2017.

    https://linkinghub.elsevier.com/retrieve/pii/S1071581916300623

  • Greenwald S, Loreti L, Funk M, Zilberman R and Maes P. Eye gaze tracking with google cardboard using purkinje images. Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology. (19-22).

    https://doi.org/10.1145/2993369.2993407

  • Lioulemes A, Papakostas M, Gieser S, Toutountzi T, Abujelala M, Gupta S, Collander C, Mcmurrough C and Makedon F. A Survey of Sensing Modalities for Human Activity, Behavior, and Physiological Monitoring. Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments. (1-8).

    https://doi.org/10.1145/2910674.2910711

  • Chinsatitf W and Saitoh T. (2016). Improvement of eye detection performance for inside-out camera. 2016 IEEE/ACIS 15th International Conference on Computer and Information Science (ICIS). 10.1109/ICIS.2016.7550794. 978-1-5090-0806-3. (1-6).

    http://ieeexplore.ieee.org/document/7550794/

  • Adiba A, Tanaka N and Miyake J. An Adjustable Gaze Tracking System and Its Application for Automatic Discrimination of Interest Objects. IEEE/ASME Transactions on Mechatronics. 10.1109/TMECH.2015.2470522. 21:2. (973-979).

    http://ieeexplore.ieee.org/document/7210210/

  • Eivazi S, Bednarik R, Leinonen V, von und zu Fraunberg M and Jaaskelainen J. Embedding an Eye Tracker Into a Surgical Microscope: Requirements, Design, and Implementation. IEEE Sensors Journal. 10.1109/JSEN.2015.2501237. 16:7. (2070-2078).

    http://ieeexplore.ieee.org/document/7329925/

  • Pfeiffer T, Renner P and Pfeiffer-Leßmann N. EyeSee3D 2.0. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. (189-196).

    https://doi.org/10.1145/2857491.2857532

  • Chun J, Bae B and Jo S. (2016). BCI based hybrid interface for 3D object control in virtual reality. 2016 4th International Winter Conference on Brain-Computer Interface (BCI). 10.1109/IWW-BCI.2016.7457461. 978-1-4673-7841-3. (1-4).

    http://ieeexplore.ieee.org/document/7457461/

  • Tuisku O, Rantanen V, Špakov O, Surakka V and Lekkala J. (2014). Pointing and Selecting with Facial Activity. Interacting with Computers. 10.1093/iwc/iwu026. 28:1. (1-12). Online publication date: 1-Jan-2016.

    https://academic.oup.com/iwc/article-lookup/doi/10.1093/iwc/iwu026

  • Xia L, Sheng B, Wu W, Ma L and Li P. (2016). Accurate gaze tracking from single camera using gabor corner detector. Multimedia Tools and Applications. 75:1. (221-239). Online publication date: 1-Jan-2016.

    https://doi.org/10.1007/s11042-014-2288-4

  • Rosa P, Esteves F and Arriaga P. Beyond Traditional Clinical Measurements for Screening Fears and Phobias. IEEE Transactions on Instrumentation and Measurement. 10.1109/TIM.2015.2450292. 64:12. (3396-3404).

    http://ieeexplore.ieee.org/document/7271039/

  • Ratsamee P, Mae Y, Kamiyama K, Horade M, Kojima M, Kiyokawa K, Mashita T, Kuroda Y, Takemura H and Arai T. (2015). Object search framework based on gaze interaction. 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO). 10.1109/ROBIO.2015.7419066. 978-1-4673-9675-2. (1997-2002).

    http://ieeexplore.ieee.org/document/7419066/

  • Wong H. Instantaneous and Robust Eye-Activity Based Task Analysis. Proceedings of the 2015 ACM on International Conference on Multimodal Interaction. (605-609).

    https://doi.org/10.1145/2818346.2823312

  • Karamchandani H, Chau T, Hobbs D and Mumford L. Development of a low-cost, portable, tablet-based eye tracking system for children with impairments. Proceedings of the international Convention on Rehabilitation Engineering & Assistive Technology. (1-4).

    https://dl.acm.org/doi/10.5555/2846712.2846718

  • Mompean J, Aragon J, Prieto P and Artal P. GPU-Accelerated High-Speed Eye Pupil Tracking System. Proceedings of the 2015 27th International Symposium on Computer Architecture and High Performance Computing (SBAC-PAD). (17-24).

    https://doi.org/10.1109/SBAC-PAD.2015.17

  • Jalaliniya S, Mardanbegi D, Sintos I and Garcia D. EyeDroid. Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers. (873-879).

    https://doi.org/10.1145/2800835.2804336

  • Discombe R and Cotterill S. (2015). Eye tracking in sport: A guide for new and aspiring researchers. Sport & Exercise Psychology Review. 10.53841/bpssepr.2015.11.2.49. 11:2. (49-58). Online publication date: 1-Sep-2015.

    https://explore.bps.org.uk/lookup/doi/10.53841/bpssepr.2015.11.2.49

  • Seagull F. (2015). Methods and Applications of Eye Tracking. The Cambridge Handbook of Applied Perception Research. 10.1017/CBO9780511973017.009. (60-78).

    https://www.cambridge.org/core/product/identifier/9780511973017%23c09640-5-1/type/book_part

  • Kim M, Kim B and Jo S. Quantitative Evaluation of a Low-Cost Noninvasive Hybrid Interface Based on EEG and Eye Movement. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 10.1109/TNSRE.2014.2365834. 23:2. (159-168).

    https://ieeexplore.ieee.org/document/6948350/

  • Lopez-Basterretxea A, Mendez-Zorrilla A and Garcia-Zapirain B. (2015). Eye/Head Tracking Technology to Improve HCI with iPad Applications. Sensors. 10.3390/s150202244. 15:2. (2244-2264).

    https://www.mdpi.com/1424-8220/15/2/2244

  • Kim B and Jo S. (2015). Real-time motion artifact detection and removal for ambulatory BCI. 2015 3rd International Winter Conference on Brain-Computer Interface (BCI). 10.1109/IWW-BCI.2015.7073050. 978-1-4799-7494-8. (1-4).

    http://ieeexplore.ieee.org/document/7073050/

  • Kim Y and Jo S. (2015). Wearable hybrid brain-computer interface for daily life application. 2015 3rd International Winter Conference on Brain-Computer Interface (BCI). 10.1109/IWW-BCI.2015.7073029. 978-1-4799-7494-8. (1-4).

    http://ieeexplore.ieee.org/document/7073029/

  • Karakoc N, Karahan S and Akgul Y. (2015). Regressor Based Estimation of the Eye Pupil Center. Pattern Recognition. 10.1007/978-3-319-24947-6_40. (481-491).

    https://link.springer.com/10.1007/978-3-319-24947-6_40

  • Toivanen M and Lukander K. (2015). Improving Model-Based Mobile Gaze Tracking. Intelligent Decision Technologies. 10.1007/978-3-319-19857-6_52. (611-625).

    http://link.springer.com/10.1007/978-3-319-19857-6_52

  • Kim Y and Jo S. (2014). Wearable wireless interface based on brain activity and eye movement. 2014 14th International Conference on Control, Automation and Systems (ICCAS). 10.1109/ICCAS.2014.6988004. 978-8-9932-1507-6. (286-289).

    http://ieeexplore.ieee.org/document/6988004/

  • Kassner M, Patera W and Bulling A. Pupil. Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication. (1151-1160).

    https://doi.org/10.1145/2638728.2641695

  • Wang X and Winslow B. (2014). Eye Tracking in Virtual Environments. Handbook of Virtual Environments. 10.1201/b17360-11. (197-210). Online publication date: 4-Sep-2014.

    http://www.crcnetbase.com/doi/abs/10.1201/b17360-11

  • Burton L, Albert W and Flynn M. (2014). A Comparison of the Performance of Webcam vs. Infrared Eye Tracking Technology. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 10.1177/1541931214581300. 58:1. (1437-1441). Online publication date: 1-Sep-2014.

    https://journals.sagepub.com/doi/10.1177/1541931214581300

  • Takemura K, Takahashi K, Takamatsu J and Ogasawara T. Estimating 3-D Point-of-Regard in a Real Environment Using a Head-Mounted Eye-Tracking System. IEEE Transactions on Human-Machine Systems. 10.1109/THMS.2014.2318324. 44:4. (531-536).

    http://ieeexplore.ieee.org/document/6814953/

  • Suefusa K and Tanaka T. (2014). Visually stimulated brain-computer interfaces compete with eye tracking interfaces when using small targets. 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 10.1109/EMBC.2014.6944502. 978-1-4244-7929-0. (4005-4008).

    http://ieeexplore.ieee.org/document/6944502/

  • Kim B, Kim M and Jo S. (2014). Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking. Computers in Biology and Medicine. 51. (82-92). Online publication date: 1-Aug-2014.

    https://doi.org/10.1016/j.compbiomed.2014.04.020

  • Mayberry A, Hu P, Marlin B, Salthouse C and Ganesan D. iShadow. Proceedings of the 12th annual international conference on Mobile systems, applications, and services. (82-94).

    https://doi.org/10.1145/2594368.2594388

  • Pfeiffer T and Renner P. EyeSee3D. Proceedings of the Symposium on Eye Tracking Research and Applications. (369-376).

    https://doi.org/10.1145/2578153.2628814

  • Pfeiffer T and Renner P. EyeSee3D. Proceedings of the Symposium on Eye Tracking Research and Applications. (195-202).

    https://doi.org/10.1145/2578153.2578183

  • Woods A, Holliman N, Favalora G, Li Q and Schonfeld D. (2014). General stereoscopic distortion rectification due to arbitrary viewer motion in binocular stereoscopic display. IS&T/SPIE Electronic Imaging. 10.1117/12.2038282. (90111Y). Online publication date: 6-Mar-2014.

    http://proceedings.spiedigitallibrary.org/proceeding.aspx?doi=10.1117/12.2038282

  • Li Y, Monaghan D and O'Connor N. Real-Time Gaze Estimation Using a Kinect and a HD Webcam. Proceedings of the 20th Anniversary International Conference on MultiMedia Modeling - Volume 8325. (506-517).

    https://doi.org/10.1007/978-3-319-04114-8_43

  • Pfeiffer T. (2014). Gaze-Based Assistive Technologies. Assistive Technologies and Computer Access for Motor Disabilities. 10.4018/978-1-4666-4438-0.ch004. (90-109).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-4666-4438-0.ch004

  • Zugal S and Pinggera J. (2014). Low–Cost Eye–Trackers: Useful for Information Systems Research?. Advanced Information Systems Engineering Workshops. 10.1007/978-3-319-07869-4_14. (159-170).

    http://link.springer.com/10.1007/978-3-319-07869-4_14

  • Lanatà A, Valenza G and Scilingo E. (2012). Eye gaze patterns in emotional pictures. Journal of Ambient Intelligence and Humanized Computing. 10.1007/s12652-012-0147-6. 4:6. (705-715). Online publication date: 1-Dec-2013.

    http://link.springer.com/10.1007/s12652-012-0147-6

  • Nitschke C, Nakazawa A and Nishida T. I See What You See. Proceedings of the 2013 2nd IAPR Asian Conference on Pattern Recognition. (298-304).

    https://doi.org/10.1109/ACPR.2013.84

  • Naugle E and Hoskinson R. (2013). Two gaze-detection methods for power reduction in near-to-eye displays for wearable computing. 2013 IEEE 9th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob). 10.1109/WiMOB.2013.6673429. 978-1-4799-0428-0. (675-680).

    http://ieeexplore.ieee.org/document/6673429/

  • Sogo H. (2012). GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis. Behavior Research Methods. 10.3758/s13428-012-0286-x. 45:3. (684-695). Online publication date: 1-Sep-2013.

    http://link.springer.com/10.3758/s13428-012-0286-x

  • de Bruin J, Malan K and Eloff J. Saccade deviation indicators for automated eye tracking analysis. Proceedings of the 2013 Conference on Eye Tracking South Africa. (47-54).

    https://doi.org/10.1145/2509315.2509324

  • Lukander K, Jagadeesan S, Chi H and Müller K. OMG!. Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services. (408-411).

    https://doi.org/10.1145/2493190.2493214

  • Ram S and Kalwad P. (2013). Computer interaction based on voluntary ocular motility for the physically challenged. 2013 IEEE Global Humanitarian Technology Conference: South Asia Satellite (GHTC-SAS). 10.1109/GHTC-SAS.2013.6629914. 978-1-4799-1095-3. (191-195).

    http://ieeexplore.ieee.org/document/6629914/

  • Kim M, Chae Y and Jo S. (2013). Hybrid EEG and eye movement interface to multi-directional target selection. 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 10.1109/EMBC.2013.6609612. 978-1-4577-0216-7. (763-766).

    http://ieeexplore.ieee.org/document/6609612/

  • McMurrough C, Ranatunga I, Papangelis A, Popa D and Makedon F. A development and evaluation platform for non-tactile power wheelchair controls. Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments. (1-4).

    https://doi.org/10.1145/2504335.2504339

  • Matthies D. InEar BioFeedController. CHI '13 Extended Abstracts on Human Factors in Computing Systems. (1293-1298).

    https://doi.org/10.1145/2468356.2468587

  • Chen S, Epps J and Chen F. Automatic and continuous user task analysis via eye activity. Proceedings of the 2013 international conference on Intelligent user interfaces. (57-66).

    https://doi.org/10.1145/2449396.2449406

  • Armato A, Lanatà A and Scilingo E. (2013). Comparative study on photometric normalization algorithms for an innovative, robust and real-time eye gaze tracker. Journal of Real-Time Image Processing. 8:1. (21-33). Online publication date: 1-Mar-2013.

    https://doi.org/10.1007/s11554-011-0217-6

  • Goggins S, Schmidt M, Guajardo J and Moore J. (2013). 3D Virtual Worlds. Integrations of Technology Utilization and Social Dynamics in Organizations. 10.4018/978-1-4666-1948-7.ch012. (194-213).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-4666-1948-7.ch012

  • Tuisku O, Surakka V, Rantanen V, Vanhala T and Lekkala J. (2013). Text entry by gazing and smiling. Advances in Human-Computer Interaction. 2013. (1-1). Online publication date: 1-Jan-2013.

    https://doi.org/10.1155/2013/218084

  • Jillela R, Ross A, Boddeti V, Kumar B, Hu X, Plemmons R and Pauca P. (2013). Iris Segmentation for Challenging Periocular Images. Handbook of Iris Recognition. 10.1007/978-1-4471-4402-1_14. (281-308).

    https://link.springer.com/10.1007/978-1-4471-4402-1_14

  • McMurrough C, Rich J, Conly C, Athitsos V and Makedon F. Multi-modal object of interest detection using eye gaze and RGB-D cameras. Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction. (1-6).

    https://doi.org/10.1145/2401836.2401838

  • McMurrough C. Multi-modal interfaces for control of assistive robotic devices. Proceedings of the 14th ACM international conference on Multimodal interaction. (329-332).

    https://doi.org/10.1145/2388676.2388749

  • Alt F, Shirazi A, Schmidt A and Mennenöh J. Increasing the user's attention on the web. Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design. (544-553).

    https://doi.org/10.1145/2399016.2399099

  • Kanade T and Hebert M. First-Person Vision. Proceedings of the IEEE. 10.1109/JPROC.2012.2200554. 100:8. (2442-2453).

    http://ieeexplore.ieee.org/document/6232429/

  • McMurrough C, Rich J, Metsis V, Nguyen A and Makedon F. Low-cost head position tracking for gaze point estimation. Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments. (1-4).

    https://doi.org/10.1145/2413097.2413125

  • Rantanen V, Verho J, Lekkala J, Tuisku O, Surakka V and Vanhala T. The effect of clicking by smiling on the accuracy of head-mounted gaze tracking. Proceedings of the Symposium on Eye Tracking Research and Applications. (345-348).

    https://doi.org/10.1145/2168556.2168633

  • Verma S, Pillai P and Hu Y. (2012). Development of an eye-tracking control system using AForge.NET framework. International Journal of Intelligent Systems Technologies and Applications. 11:3/4. (286-303). Online publication date: 1-Mar-2012.

    https://doi.org/10.1504/IJISTA.2012.052485

  • Mantiuk R, Kowalik M, Nowosielski A and Bazyluk B. Do-It-yourself eye tracker. Proceedings of the 18th international conference on Advances in Multimedia Modeling. (115-125).

    https://doi.org/10.1007/978-3-642-27355-1_13

  • Villanueva A, Cabeza R and San Agustin J. Gaze Estimation. Gaze Interaction and Applications of Eye Tracking. 10.4018/978-1-61350-098-9.ch021. (310-325).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-61350-098-9.ch021

  • Majaranta P and Donegan M. Introduction to Gaze Interaction. Gaze Interaction and Applications of Eye Tracking. 10.4018/978-1-61350-098-9.ch001. (1-9).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-61350-098-9.ch001

  • Rozado D, Agustin J, Rodriguez F and Varona P. (2012). Gliding and saccadic gaze gesture recognition in real time. ACM Transactions on Interactive Intelligent Systems. 1:2. (1-27). Online publication date: 1-Jan-2012.

    https://doi.org/10.1145/2070719.2070723

  • Hu X, Pauca V and Plemmons R. Iterative directional ray-based iris segmentation for challenging periocular images. Proceedings of the 6th Chinese conference on Biometric recognition. (91-99).

    https://dl.acm.org/doi/10.5555/2074627.2074641

  • Tsukada A, Shino M, Devyver M and Kanade T. (2011). Illumination-free gaze estimation method for first-person vision wearable device. 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops). 10.1109/ICCVW.2011.6130505. 978-1-4673-0063-6. (2084-2091).

    http://ieeexplore.ieee.org/document/6130505/

  • Tuisku O, Surakka V, Gizatdinova Y, Vanhala T, Rantanen V, Verho J and Lekkala J. Gazing and Frowning to Computers Can Be Enjoyable. Proceedings of the 2011 Third International Conference on Knowledge and Systems Engineering. (211-218).

    https://doi.org/10.1109/KSE.2011.41

  • Buckley M, Vaidyanathan R and Mayol-Cuevas W. Sensor suites for assistive arm prosthetics. Proceedings of the 2011 24th International Symposium on Computer-Based Medical Systems. (1-6).

    https://doi.org/10.1109/CBMS.2011.5999153

  • Schneider N, Bex P, Barth E and Dorr M. An open-source low-cost eye-tracking system for portable real-time and offline tracking. Proceedings of the 1st Conference on Novel Gaze-Controlled Applications. (1-4).

    https://doi.org/10.1145/1983302.1983310

  • Skovsgaard H, Agustin J, Johansen S, Hansen J and Tall M. Evaluation of a remote webcam-based eye tracker. Proceedings of the 1st Conference on Novel Gaze-Controlled Applications. (1-4).

    https://doi.org/10.1145/1983302.1983309

  • Sigut J and Sidha S. Iris Center Corneal Reflection Method for Gaze Tracking Using Visible Light. IEEE Transactions on Biomedical Engineering. 10.1109/TBME.2010.2087330. 58:2. (411-419).

    http://ieeexplore.ieee.org/document/5601753/

  • Oikawa A, Muro T and Miki N. (2011). Wearable Line-of-Sight Detection System Using Transparent Optical Sensor Arrays. Journal of the Robotics Society of Japan. 10.7210/jrsj.29.369. 29:4. (369-375).

    http://www.jstage.jst.go.jp/article/jrsj/29/4/29_4_369/_article/-char/ja/

  • Goggins S, Schmidt M, Guajardo J and Moore J. (2011). 3D Virtual Worlds. International Journal of Social and Organizational Dynamics in IT. 10.4018/ijsodit.2011010103. 1:1. (30-48). Online publication date: 1-Jan-2011.

    https://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/ijsodit.2011010103

  • Miluzzo E, Wang T and Campbell A. EyePhone. Proceedings of the second ACM SIGCOMM workshop on Networking, systems, and applications on mobile handhelds. (15-20).

    https://doi.org/10.1145/1851322.1851328

  • Fabian T, Gaura J and Kotas P. (2010). An algorithm for iris extraction. 2010 2nd International Conference on Image Processing Theory, Tools and Applications (IPTA). 10.1109/IPTA.2010.5586756. 978-1-4244-7247-5. (464-468).

    http://ieeexplore.ieee.org/document/5586756/

  • Ishiguro Y, Mujibiya A, Miyaki T and Rekimoto J. Aided eyes. Proceedings of the 1st Augmented Human International Conference. (1-7).

    https://doi.org/10.1145/1785455.1785480

  • Ryan W, Duchowski A, Vincent E and Battisto D. Match-moving for area-based analysis of eye movements in natural tasks. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (235-242).

    https://doi.org/10.1145/1743666.1743722

  • Takemura K, Kohashi Y, Suenaga T, Takamatsu J and Ogasawara T. Estimating 3D point-of-regard and visualizing gaze trajectories under natural head movements. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (157-160).

    https://doi.org/10.1145/1743666.1743705

  • Wästlund E, Sponseller K and Pettersson O. What you see is where you go. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (133-136).

    https://doi.org/10.1145/1743666.1743699

  • Hennessey C and Duchowski A. An open source eye-gaze interface. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (81-84).

    https://doi.org/10.1145/1743666.1743686

  • Goggins S, Schmidt M, Guajardo J and Moore J. Assessing Multiple Perspectives in Three Dimensional Virtual Worlds. Proceedings of the 2010 43rd Hawaii International Conference on System Sciences. (1-10).

    https://doi.org/10.1109/HICSS.2010.71

  • Schiavone G, Campolo D, Keller F and Guglielmelli E. Calibration of a multimodal head-mounted device for ecological assessment of social orienting behavior in children. Proceedings of the 2009 IEEE/RSJ international conference on Intelligent robots and systems. (1031-1036).

    https://dl.acm.org/doi/10.5555/1733343.1733539

  • Schiavone G, Campolo D, Keller F and Guglielmelli E. (2009). Calibration of a multimodal head-mounted device for ecological assessment of social orienting behavior in children. 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2009). 10.1109/IROS.2009.5354254. 978-1-4244-3803-7. (1031-1036).

    http://ieeexplore.ieee.org/document/5354254/

  • Munn S and Pelz J. (2009). FixTag: An algorithm for identifying and tagging fixations to simplify the analysis of data collected by portable eye trackers. ACM Transactions on Applied Perception. 6:3. (1-25). Online publication date: 1-Aug-2009.

    https://doi.org/10.1145/1577755.1577759

  • Schneider E, Villgrattner T, Vockeroth J, Bartl K, Kohlbecher S, Bardins S, Ulbrich H and Brandt T. (2009). EyeSeeCam: An Eye Movement–Driven Head Camera for the Examination of Natural Visual Exploration. Annals of the New York Academy of Sciences. 10.1111/j.1749-6632.2009.03858.x. 1164:1. (461-467). Online publication date: 1-May-2009.

    https://nyaspubs.onlinelibrary.wiley.com/doi/10.1111/j.1749-6632.2009.03858.x

  • Crisafulli G, Iannizzotto G and La Rosa F. (2009). Two competitive solutions to the problem of remote eye-tracking. 2009 2nd Conference on Human System Interactions (HSI). 10.1109/HSI.2009.5091005. 978-1-4244-3959-1. (356-362).

    http://ieeexplore.ieee.org/document/5091005/

  • Pichiliani M, Hirata C, Soares F and Forster C. (2009). TeleEye: An Awareness Widget for Providing the Focus of Attention in Collaborative Editing Systems. Collaborative Computing: Networking, Applications and Worksharing. 10.1007/978-3-642-03354-4_20. (258-270).

    http://link.springer.com/10.1007/978-3-642-03354-4_20

  • Pichiliani M, Hirata C, Soares F and Forster C. Utilização do Rastreamento Ocular para Visualização do Local de Atenção em Sistemas de Edição Colaborativos [Use of Eye Tracking to Visualize the Locus of Attention in Collaborative Editing Systems]. Proceedings of the 2008 Simpósio Brasileiro de Sistemas Colaborativos. (169-179).

    https://doi.org/10.1109/SBSC.2008.10

  • Quek F, Ehrich R and Lockhart T. As go the feet.... Proceedings of the 10th international conference on Multimodal interfaces. (97-104).

    https://doi.org/10.1145/1452392.1452412

  • Topal C, Dogan A and Gerek O. (2008). A wearable head-mounted sensor-based apparatus for eye tracking applications. 2008 IEEE Conference on Virtual Environments, Human-Computer Interfaces and Measurement Systems. 10.1109/VECIMS.2008.4592768. 978-1-4244-1927-2. (136-139).

    https://ieeexplore.ieee.org/document/4592768/

  • Marra S and Pirri F. Eyes and cameras calibration for 3D world gaze detection. Proceedings of the 6th international conference on Computer vision systems. (216-227).

    https://dl.acm.org/doi/10.5555/1788524.1788549

  • Heidenburg B, Lenisa M, Wentzel D and Malinowski A. (2008). Data mining for gaze tracking system. 2008 Conference on Human System Interactions (HSI). 10.1109/HSI.2008.4581522. 978-1-4244-1542-7. (680-683).

    http://ieeexplore.ieee.org/document/4581522/

  • Topal C, Dogan A and Gerek O. (2008). An eye-glasses-like wearable eye gaze tracking system. 2008 IEEE 16th Signal Processing, Communication and Applications Conference (SIU). 10.1109/SIU.2008.4632568. 978-1-4244-1998-2. (1-4).

    https://ieeexplore.ieee.org/document/4632568/

  • Hansen D, Skovsgaard H, Hansen J and Møllenbach E. Noise tolerant selection by gaze-controlled pan and zoom in 3D. Proceedings of the 2008 symposium on Eye tracking research & applications. (205-212).

    https://doi.org/10.1145/1344471.1344521

  • Yun Z, Xin-Bo Z, Rong-Chun Z, Yuan Z and Xiao-Chun Z. EyeSecret. Proceedings of the 2008 symposium on Eye tracking research & applications. (103-106).

    https://doi.org/10.1145/1344471.1344498

  • Topal C, Gerek Ö and Doğan A. A head-mounted sensor-based eye tracking device. Proceedings of the 2008 symposium on Eye tracking research & applications. (87-90).

    https://doi.org/10.1145/1344471.1344494

  • Ryan W, Duchowski A and Birchfield S. Limbus/pupil switching for wearable eye tracking under variable lighting conditions. Proceedings of the 2008 symposium on Eye tracking research & applications. (61-64).

    https://doi.org/10.1145/1344471.1344487

  • Marra S and Pirri F. Eyes and Cameras Calibration for 3D World Gaze Detection. Computer Vision Systems. 10.1007/978-3-540-79547-6_21. (216-227).

    http://link.springer.com/10.1007/978-3-540-79547-6_21

  • Yonezawa T, Yamazoe H, Utsumi A and Abe S. Gazecoppet. ACM SIGGRAPH 2007 posters. (160-es).

    https://doi.org/10.1145/1280720.1280894

  • Piccardi L, Noris B, Barbey O, Billard A, Schiavone G, Keller F and von Hofsten C. (2007). WearCam: A head mounted wireless camera for monitoring gaze attention and for the diagnosis of developmental disorders in young children. RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication. 10.1109/ROMAN.2007.4415154. 978-1-4244-1634-9. (594-598).

    http://ieeexplore.ieee.org/document/4415154/