
Eyelid Gestures on Mobile Devices for People with Motor Impairments

Published: 29 October 2020
DOI: 10.1145/3373625.3416987

Abstract

Eye-based interactions for people with motor impairments have often used clunky or specialized equipment (e.g., eye-trackers with non-mobile computers) and primarily focused on gaze and blinks. However, the two eyelids can be opened and closed for different durations and in different orders to form a variety of eyelid gestures. We take a first step toward designing, detecting, and evaluating a set of eyelid gestures for people with motor impairments on mobile devices. We present an algorithm that detects nine eyelid gestures on smartphones in real time and evaluate it with twelve able-bodied people and four people with severe motor impairments in two studies. The results of the study with people with motor impairments show that the algorithm can detect the gestures with 0.76 and 0.69 overall accuracy in user-dependent and user-independent evaluations, respectively. Moreover, we design and evaluate a gesture-mapping scheme that allows users to navigate mobile applications using only eyelid gestures. Finally, we present recommendations for designing and using eyelid gestures for people with motor impairments.
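
The detection algorithm itself is not reproduced on this page, but the core idea in the abstract (classifying which eyelids close, in what order, and for how long) can be illustrated with a small state machine. The Kotlin sketch below is illustrative only and is not the authors' algorithm: the per-eye open probabilities (of the kind mobile face trackers report), the 0.5 closed threshold, the 600 ms long-close cutoff, and the four-gesture vocabulary are all assumptions.

    // Hypothetical eyelid-gesture detector: turns a stream of per-eye "open"
    // probabilities into discrete gestures. All thresholds are illustrative.
    data class EyeSample(val leftOpen: Float, val rightOpen: Float, val timeMs: Long)

    enum class Gesture { BLINK_BOTH, WINK_LEFT, WINK_RIGHT, CLOSE_BOTH_LONG }

    class EyelidGestureDetector(
        private val closedThreshold: Float = 0.5f, // assumed: below this, the eye counts as closed
        private val longCloseMs: Long = 600L       // assumed: closures at least this long are "long"
    ) {
        private var closeStartMs = -1L
        private var leftWasClosed = false
        private var rightWasClosed = false

        /** Feed one sample; returns a gesture when a closure episode ends, else null. */
        fun onSample(s: EyeSample): Gesture? {
            val leftClosed = s.leftOpen < closedThreshold
            val rightClosed = s.rightOpen < closedThreshold

            if (leftClosed || rightClosed) {           // a closure episode is in progress
                if (closeStartMs < 0) closeStartMs = s.timeMs
                leftWasClosed = leftWasClosed || leftClosed
                rightWasClosed = rightWasClosed || rightClosed
                return null
            }
            if (closeStartMs < 0) return null          // both eyes open, nothing pending

            val duration = s.timeMs - closeStartMs     // episode just ended; classify it
            val gesture = when {
                leftWasClosed && rightWasClosed ->
                    if (duration >= longCloseMs) Gesture.CLOSE_BOTH_LONG else Gesture.BLINK_BOTH
                leftWasClosed -> Gesture.WINK_LEFT
                else -> Gesture.WINK_RIGHT
            }
            closeStartMs = -1L
            leftWasClosed = false
            rightWasClosed = false
            return gesture
        }
    }

    fun main() {
        val detector = EyelidGestureDetector()
        // Simulated ~30 ms camera frames: both eyes close briefly, then reopen.
        val frames = listOf(
            EyeSample(0.9f, 0.9f, 0L), EyeSample(0.1f, 0.2f, 30L),
            EyeSample(0.1f, 0.1f, 60L), EyeSample(0.9f, 0.9f, 90L)
        )
        for (f in frames) detector.onSample(f)?.let { println(it) } // prints BLINK_BOTH
    }

A real-time detector of the kind the paper evaluates would additionally have to cope with tracker noise, partial closures, and per-person differences in blink behavior, which is presumably part of why the reported user-dependent accuracy (0.76) is higher than the user-independent one (0.69).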





    Published In

    ASSETS '20: Proceedings of the 22nd International ACM SIGACCESS Conference on Computers and Accessibility
    October 2020
    764 pages
    ISBN: 9781450371032
    DOI: 10.1145/3373625
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. eyelid gestures
    2. mobile interaction
    3. people with motor impairments

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    ASSETS '20

    Acceptance Rates

    ASSETS '20 paper acceptance rate: 46 of 167 submissions, 28%
    Overall acceptance rate: 436 of 1,556 submissions, 28%

    Cited By
    • (2024) A3C: An Image-Association-Based Computing Device Authentication Framework for People with Upper Extremity Impairments. ACM Transactions on Accessible Computing 17(2), 1–37. DOI: 10.1145/3652522. Online publication date: 19-Mar-2024
    • (2024) Designing Upper-Body Gesture Interaction with and for People with Spinal Muscular Atrophy in VR. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–19. DOI: 10.1145/3613904.3642884. Online publication date: 11-May-2024
    • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56(2), 1–38. DOI: 10.1145/3606947. Online publication date: 15-Sep-2023
    • (2023) How Do People with Limited Movement Personalize Upper-Body Gestures? Considerations for the Design of Personalized and Accessible Gesture Interfaces. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1–15. DOI: 10.1145/3597638.3608430. Online publication date: 22-Oct-2023
    • (2023) Seeking information about assistive technology. International Journal of Human-Computer Studies 177(C). DOI: 10.1016/j.ijhcs.2023.103078. Online publication date: 1-Sep-2023
    • (2022) Methodological Standards in Accessibility Research on Motor Impairments: A Survey. ACM Computing Surveys 55(7), 1–35. DOI: 10.1145/3543509. Online publication date: 15-Dec-2022
    • (2022) "It Feels Like Being Locked in A Cage": Understanding Blind or Low Vision Streamers' Perceptions of Content Curation Algorithms. Proceedings of the 2022 ACM Designing Interactive Systems Conference, 571–585. DOI: 10.1145/3532106.3533514. Online publication date: 13-Jun-2022
    • (2022) Wigglite: Low-cost Information Collection and Triage. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1–16. DOI: 10.1145/3526113.3545661. Online publication date: 29-Oct-2022
    • (2022) "I Don't Want People to Look At Me Differently". Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1–15. DOI: 10.1145/3491102.3517552. Online publication date: 29-Apr-2022
    • (2022) Crystalline: Lowering the Cost for Developers to Collect and Organize Information for Decision Making. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1–16. DOI: 10.1145/3491102.3501968. Online publication date: 29-Apr-2022
