• Wang Q, Ma X, Lu Y, Wang D and Sun Y. (2024). The impact of visual and motor space size on gaze-based target selection. Australian Journal of Psychology. 10.1080/00049530.2024.2309384. 76:1.

    https://www.tandfonline.com/doi/full/10.1080/00049530.2024.2309384

  • Hou B, Newn J, Sidenmark L, Khan A and Gellersen H. (2024). GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction. 8:ETRA. (1-20). Online publication date: 20-May-2024.

    https://doi.org/10.1145/3655601

  • Hamid A and Kristensson P. (2024). 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction. 8:ETRA. (1-19). Online publication date: 20-May-2024.

    https://doi.org/10.1145/3655596

  • Dondi P, Sapuppo S and Porta M. (2024). Leyenes. International Journal of Human-Computer Studies. 184:C. Online publication date: 1-Apr-2024.

    https://doi.org/10.1016/j.ijhcs.2023.103204

  • Hu J, Dudley J and Kristensson P. (2024). SkiMR: Dwell-free Eye Typing in Mixed Reality. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 10.1109/VR58804.2024.00065. 979-8-3503-7402-5. (439-449).

    https://ieeexplore.ieee.org/document/10494160/

  • Zeng Z, Wang X, Siebert F and Liu H. (2023). Enhancing Hybrid Eye Typing Interfaces with Word and Letter Prediction: A Comprehensive Evaluation. International Journal of Human–Computer Interaction. 10.1080/10447318.2023.2297113. (1-13).

    https://www.tandfonline.com/doi/full/10.1080/10447318.2023.2297113

  • Pasini N, Mariani A, Deguet A, Kazanzides P and De Momi E. (2023). GRACE: Online Gesture Recognition for Autonomous Camera-Motion Enhancement in Robot-Assisted Surgery. IEEE Robotics and Automation Letters. 10.1109/LRA.2023.3326690. 8:12. (8263-8270).

    https://ieeexplore.ieee.org/document/10290934/

  • Zeng Z, Neuer E, Roetting M and Siebert F. (2022). A One-Point Calibration Design for Hybrid Eye Typing Interface. International Journal of Human–Computer Interaction. 10.1080/10447318.2022.2101186. 39:18. (3620-3633). Online publication date: 8-Nov-2023.

    https://www.tandfonline.com/doi/full/10.1080/10447318.2022.2101186

  • Mutasim A, Batmaz A, Hudhud Mughrabi M and Stuerzlinger W. Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology. (1-12).

    https://doi.org/10.1145/3562939.3565619

  • Matulewski J, Bałaj B, Mościchowska I, Ignaczewska A, Linowiecki R, Dreszer J and Duch W. (2022). Learnability evaluation of the markup language for designing applications controlled by gaze. International Journal of Human-Computer Studies. 165:C. Online publication date: 1-Sep-2022.

    https://doi.org/10.1016/j.ijhcs.2022.102863

  • Matulewski J and Patera M. Usability of the super-vowel for gaze-based text entry. 2022 Symposium on Eye Tracking Research and Applications. (1-5).

    https://doi.org/10.1145/3517031.3529231

  • Rajanna V, Russel M, Zhao J and Hammond T. (2022). PressTapFlick. International Journal of Human-Computer Studies. 161:C. Online publication date: 1-May-2022.

    https://doi.org/10.1016/j.ijhcs.2022.102787

  • Porta M, Dondi P, Pianetta A and Cantoni V. SPEye: A Calibration-Free Gaze-Driven Text Entry Technique Based on Smooth Pursuit. IEEE Transactions on Human-Machine Systems. 10.1109/THMS.2021.3123202. 52:2. (312-323).

    https://ieeexplore.ieee.org/document/9619860/

  • Hossain T, Islam M, Delamare W, Chowdhury F and Hasan K. Exploring Social Acceptability and Users’ Preferences of Head- and Eye-Based Interaction with Mobile Devices. Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia. (12-23).

    https://doi.org/10.1145/3490632.3490636

  • Hwang I, Tsai Y, Zeng B, Lin C, Shiue H and Chang G. (2020). Integration of eye tracking and lip motion for hands-free computer access. Universal Access in the Information Society. 20:2. (405-416). Online publication date: 1-Jun-2021.

    https://doi.org/10.1007/s10209-020-00723-w

  • Lucas N and Pandya A. (2021). Multirobot Confidence and Behavior Modeling: An Evaluation of Semiautonomous Task Performance and Efficiency. Robotics. 10.3390/robotics10020071. 10:2. (71).

    https://www.mdpi.com/2218-6581/10/2/71

  • Da Col T, Mariani A, Deguet A, Menciassi A, Kazanzides P and De Momi E. (2020). SCAN: System for Camera Autonomous Navigation in Robotic-Assisted Surgery. 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 10.1109/IROS45743.2020.9341548. 978-1-7281-6212-6. (2996-3002).

    https://ieeexplore.ieee.org/document/9341548/

  • Kurauchi A, Feng W, Joshi A, Morimoto C and Betke M. Swipe&Switch. Adjunct Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. (84-86).

    https://doi.org/10.1145/3379350.3416193

  • Matulewski J and Patera M. Comparison of three dwell-time-based gaze text entry methods. ACM Symposium on Eye Tracking Research and Applications. (1-5).

    https://doi.org/10.1145/3379157.3388931

  • Choi M, Sakamoto D and Ono T. Bubble Gaze Cursor + Bubble Gaze Lens: Applying Area Cursor Technique to Eye-Gaze Interface. ACM Symposium on Eye Tracking Research and Applications. (1-10).

    https://doi.org/10.1145/3379155.3391322

  • Creed C, Frutos-Pascual M and Williams I. Multimodal Gaze Interaction for Creative Design. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. (1-13).

    https://doi.org/10.1145/3313831.3376196

  • Cui J, Niu Y, Xue C, Cai X, Xie Y, Shi B and Qiu L. (2020). Eye Control System Development and Research of Effects of Color of Icons on Visual Search Performance Based on the System. Intelligent Human Systems Integration 2020. 10.1007/978-3-030-39512-4_186. (1219-1224).

    http://link.springer.com/10.1007/978-3-030-39512-4_186

  • Khan M and Lee S. (2019). Gaze and Eye Tracking: Techniques and Applications in ADAS. Sensors. 10.3390/s19245540. 19:24. (5540).

    https://www.mdpi.com/1424-8220/19/24/5540

  • Meena Y, Cecotti H, Wong-Lin K and Prasad G. (2019). Design and evaluation of a time adaptive multimodal virtual keyboard. Journal on Multimodal User Interfaces. 10.1007/s12193-019-00293-z. 13:4. (343-361). Online publication date: 1-Dec-2019.

    http://link.springer.com/10.1007/s12193-019-00293-z

  • Jaber R, McMillan D, Belenguer J and Brown B. Patterns of gaze in speech agent interaction. Proceedings of the 1st International Conference on Conversational User Interfaces. (1-10).

    https://doi.org/10.1145/3342775.3342791

  • Mardanbegi D, Langlotz T and Gellersen H. Resolving Target Ambiguity in 3D Gaze Interaction through VOR Depth Estimation. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (1-12).

    https://doi.org/10.1145/3290605.3300842

  • Shakil A, Lutteroth C and Weber G. CodeGazer. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (1-12).

    https://doi.org/10.1145/3290605.3300306

  • Satriadi K, Ens B, Cordeil M, Jenny B, Czauderna T and Willett W. (2019). Augmented Reality Map Navigation with Freehand Gestures. 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 10.1109/VR.2019.8798340. 978-1-7281-1377-7. (593-603).

    https://ieeexplore.ieee.org/document/8798340/

  • Matulewski J, Bałaj B, Marek E, Piasecki Ł, Gruszczyński D, Kuchta M and Duch W. Moveye. Proceedings of the Workshop on Communication by Gaze Interaction. (1-5).

    https://doi.org/10.1145/3206343.3206352

  • Rajanna V and Hammond T. A Fitts' law evaluation of gaze input on large displays compared to touch and mouse inputs. Proceedings of the Workshop on Communication by Gaze Interaction. (1-5).

    https://doi.org/10.1145/3206343.3206348

  • Morimoto C, Leyva J and Diaz-Tula A. Context switching eye typing using dynamic expanding targets. Proceedings of the Workshop on Communication by Gaze Interaction. (1-9).

    https://doi.org/10.1145/3206343.3206347

  • Pfeiffer T. Gaze-Based Assistive Technologies. Smart Technologies. 10.4018/978-1-5225-2589-9.ch003. (44-66).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-5225-2589-9.ch003

  • Mott M, Williams S, Wobbrock J and Morris M. Improving Dwell-Based Gaze Typing with Dynamic, Cascading Dwell Times. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. (2558-2570).

    https://doi.org/10.1145/3025453.3025517

  • Moseley M. The use of technology to provide physical interaction experiences for cognitively able young people who have complex physical disabilities. Proceedings of the 30th International BCS Human Computer Interaction Conference: Fusion!. (1-6).

    https://doi.org/10.14236/ewic/HCI2016.11

  • Rodger S, Vines J and McLaughlin J. Technology and the Politics of Mobility. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. (2417-2429).

    https://doi.org/10.1145/2858036.2858146

  • Klamka K, Siegel A, Vogt S, Göbel F, Stellmach S and Dachselt R. Look & Pedal. Proceedings of the 2015 ACM on International Conference on Multimodal Interaction. (123-130).

    https://doi.org/10.1145/2818346.2820751

  • Ghazali K, Jadin M, Jie M and Xiao R. (2015). Novel automatic eye detection and tracking algorithm. Optics and Lasers in Engineering. 10.1016/j.optlaseng.2014.11.003. 67. (49-56). Online publication date: 1-Apr-2015.

    https://linkinghub.elsevier.com/retrieve/pii/S0143816614002644

  • Porta M. (2015). A study on text entry methods based on eye gestures. Journal of Assistive Technologies. 10.1108/JAT-12-2013-0037. 9:1. (48-67). Online publication date: 16-Mar-2015.

    https://www.emerald.com/insight/content/doi/10.1108/JAT-12-2013-0037/full/html

  • Tula A and Morimoto C. Meta-keys. Proceedings of the 13th Brazilian Symposium on Human Factors in Computing Systems. (285-292).

    https://dl.acm.org/doi/10.5555/2738055.2738101

  • Pandya A, Reisner L, King B, Lucas N, Composto A, Klein M and Ellis R. (2014). A Review of Camera Viewpoint Automation in Robotic and Laparoscopic Surgery. Robotics. 10.3390/robotics3030310. 3:3. (310-329).

    https://www.mdpi.com/2218-6581/3/3/310

  • Pfeiffer T. (2014). Gaze-Based Assistive Technologies. Assistive Technologies and Computer Access for Motor Disabilities. 10.4018/978-1-4666-4438-0.ch004. (90-109).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-4666-4438-0.ch004

  • Majaranta P and Bulling A. (2014). Eye Tracking and Eye-Based Human–Computer Interaction. Advances in Physiological Computing. 10.1007/978-1-4471-6392-3_3. (39-65).

    https://link.springer.com/10.1007/978-1-4471-6392-3_3

  • Vidal M, Bulling A and Gellersen H. Pursuits. Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing. (439-448).

    https://doi.org/10.1145/2493432.2493477

  • Bacim F, Kopper R and Bowman D. (2013). Design and evaluation of 3D selection techniques based on progressive refinement. International Journal of Human-Computer Studies. 71:7-8. (785-802). Online publication date: 1-Jul-2013.

    https://doi.org/10.1016/j.ijhcs.2013.03.003

  • Špakov O and Majaranta P. Enhanced gaze interaction using simple head gestures. Proceedings of the 2012 ACM Conference on Ubiquitous Computing. (705-710).

    https://doi.org/10.1145/2370216.2370369

  • Stellmach S and Dachselt R. Look & touch. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. (2981-2990).

    https://doi.org/10.1145/2207676.2208709

  • Nielsen A, Petersen A and Hansen J. Gaming with gaze and losing with a smile. Proceedings of the Symposium on Eye Tracking Research and Applications. (365-368).

    https://doi.org/10.1145/2168556.2168638

  • Stellmach S and Dachselt R. Investigating gaze-supported multimodal pan and zoom. Proceedings of the Symposium on Eye Tracking Research and Applications. (357-360).

    https://doi.org/10.1145/2168556.2168636

  • Tula A, de Campos F and Morimoto C. Dynamic context switching for gaze based interaction. Proceedings of the Symposium on Eye Tracking Research and Applications. (353-356).

    https://doi.org/10.1145/2168556.2168635

  • Zhao X, Guestrin E, Sayenko D, Simpson T, Gauthier M and Popovic M. Typing with eye-gaze and tooth-clicks. Proceedings of the Symposium on Eye Tracking Research and Applications. (341-344).

    https://doi.org/10.1145/2168556.2168632

  • Stellmach S and Dachselt R. Designing gaze-based user interfaces for steering in virtual environments. Proceedings of the Symposium on Eye Tracking Research and Applications. (131-138).

    https://doi.org/10.1145/2168556.2168577

  • Majaranta P. Communication and Text Entry by Gaze. Gaze Interaction and Applications of Eye Tracking. 10.4018/978-1-61350-098-9.ch008. (63-77).

    http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/978-1-61350-098-9.ch008

  • Skovsgaard H, Mateo J and Hansen J. (2011). Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets. Behaviour & Information Technology. 10.1080/0144929X.2011.563801. 30:6. (821-831). Online publication date: 1-Nov-2011.

    http://www.tandfonline.com/doi/abs/10.1080/0144929X.2011.563801

  • Hansen J, San Agustin J and Skovsgaard H. Gaze interaction from bed. Proceedings of the 1st Conference on Novel Gaze-Controlled Applications. (1-4).

    https://doi.org/10.1145/1983302.1983313

  • Stellmach S, Stober S, Nürnberger A and Dachselt R. Designing gaze-supported multimodal interactions for the exploration of large image collections. Proceedings of the 1st Conference on Novel Gaze-Controlled Applications. (1-8).

    https://doi.org/10.1145/1983302.1983303

  • Zhu D, Gedeon T and Taylor K. Exploring camera viewpoint control models for a multi-tasking setting in teleoperation. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. (53-62).

    https://doi.org/10.1145/1978942.1978952

  • Lin D, Le V and Huang T. (2011). Human–Computer Interaction. Visual Analysis of Humans. 10.1007/978-0-85729-997-0_25. (493-510).

    https://link.springer.com/10.1007/978-0-85729-997-0_25

  • Urbina M and Huckauf A. Alternatives to single character entry and dwell time selection on eye typing. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (315-322).

    https://doi.org/10.1145/1743666.1743738

  • Morimoto C and Amir A. Context switching for fast key selection in text entry applications. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (271-274).

    https://doi.org/10.1145/1743666.1743730

  • Skovsgaard H, Mateo J, Flach J and Hansen J. Small-target selection with gaze alone. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (145-148).

    https://doi.org/10.1145/1743666.1743702

  • Hennessey C and Duchowski A. An open source eye-gaze interface. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (81-84).

    https://doi.org/10.1145/1743666.1743686

  • San Agustin J, Skovsgaard H, Mollenbach E, Barret M, Tall M, Hansen D and Hansen J. Evaluation of a low-cost open-source gaze tracker. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications. (77-80).

    https://doi.org/10.1145/1743666.1743685

  • Hansen D and Ji Q. (2010). In the Eye of the Beholder. IEEE Transactions on Pattern Analysis and Machine Intelligence. 32:3. (478-500). Online publication date: 1-Mar-2010.

    https://doi.org/10.1109/TPAMI.2009.30

  • Aoki H, Hansen J and Itoh K. (2009). Learning gaze typing: what are the obstacles and what progress to expect?. Universal Access in the Information Society. 8:4. (297-310). Online publication date: 27-Oct-2009.

    https://doi.org/10.1007/s10209-009-0152-5

  • Mollenbach E, Hansen J, Lillholm M and Gale A. Single stroke gaze gestures. CHI '09 Extended Abstracts on Human Factors in Computing Systems. (4555-4560).

    https://doi.org/10.1145/1520340.1520699

  • San Agustin J, Skovsgaard H, Hansen J and Hansen D. Low-cost gaze interaction. CHI '09 Extended Abstracts on Human Factors in Computing Systems. (4453-4458).

    https://doi.org/10.1145/1520340.1520682

  • San Agustin J, Hansen J, Hansen D and Skovsgaard H. Low-cost gaze pointing and EMG clicking. CHI '09 Extended Abstracts on Human Factors in Computing Systems. (3247-3252).

    https://doi.org/10.1145/1520340.1520466