Most work on the usability of touchscreen interaction for people with motor impairments has focused on lab studies with relatively few participants and small cross-sections of the population. To develop a richer characterization of use, we turned to a previously untapped source of data: YouTube videos. We collected and analyzed 187 non-commercial videos uploaded to YouTube that depicted a person with a physical disability interacting with a mainstream mobile touchscreen device. We coded the videos along a range of dimensions to characterize the interaction, the challenges encountered, and the adaptations being adopted in daily use. To complement the video data, we also invited the video uploaders to complete a survey on their ongoing use of touchscreen technology. Our findings show that, while many people with motor impairments find these devices empowering, accessibility issues still exist. In addition to providing implications for more accessible touchscreen design, we reflect on the application of user-generated content to study user interface design.
People with disabilities face considerable challenges when using touchscreens. The authors of this paper investigate how well touchscreen devices work out of the box, evaluating the extent to which motor impairments affect interaction and how users adapt to improve usability. They assembled a dataset of YouTube videos by searching on two categories of keywords: medical conditions (60 terms, such as brain injury and Parkinson's) and technology terms (eight terms, such as touchscreen, tablet, and iPad). They coded the resulting 187 noncommercial videos along four dimensions: video characteristics, device usage in the video, user characteristics, and type of user interaction. The results show that even though touchscreen devices empower people with physical disabilities in some ways, significant accessibility challenges remain.
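To make the collection procedure concrete, the sketch below shows how such a keyword-pair search could be scripted. It is not the authors' pipeline: they gathered videos through YouTube's own search in 2012, and the full keyword lists are not reproduced on this page. The sketch assumes the YouTube Data API v3 via google-api-python-client, a hypothetical API_KEY, and abbreviated placeholder keyword lists.

```python
# Illustrative sketch only, not the authors' original collection method.
# Assumes the YouTube Data API v3 and placeholder keyword lists; the paper
# used roughly 60 medical-condition terms and 8 technology terms.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # hypothetical credential

MEDICAL_TERMS = ["cerebral palsy", "brain injury", "Parkinson's", "muscular dystrophy"]
TECH_TERMS = ["touchscreen", "tablet", "iPad"]

def search_candidate_videos(max_results_per_query=25):
    """Search YouTube for every medical-condition x technology keyword pair."""
    youtube = build("youtube", "v3", developerKey=API_KEY)
    candidates = {}
    for condition in MEDICAL_TERMS:
        for tech in TECH_TERMS:
            response = youtube.search().list(
                part="snippet",
                q=f"{condition} {tech}",
                type="video",
                maxResults=max_results_per_query,
            ).execute()
            for item in response["items"]:
                # Deduplicate across overlapping queries before manual screening.
                candidates.setdefault(item["id"]["videoId"], item["snippet"]["title"])
    return candidates  # still requires manual screening (noncommercial, on-topic)

if __name__ == "__main__":
    print(f"{len(search_candidate_videos())} candidate videos to screen")
```

The keyword-pair structure mirrors the two search categories described above; everything else (API, quota handling, screening criteria) is left to the reader.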
This paper sheds light on important characteristics of touchscreen use by motor-impaired users that could have profound effects on future human-computer interaction research. First, the authors categorize interaction styles: 91 percent of the videos show direct interaction using fingers, hands, or feet, while only 8 percent show indirect interaction using an intermediary tool such as a mouthstick. The analysis reveals that people with motor impairments cannot perform certain gestures required by touch-based apps. When they touch with hands, fists, knuckles, feet, or the nose, they generally contact a broad surface area and may hold the touch long enough that the app disregards the action. These findings could form the basis for new adaptation techniques for users with physical disabilities.
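As a thought experiment rather than a technique proposed in the paper, the sketch below shows how an adaptation layer might reinterpret long-dwell, large-area contacts as intentional taps instead of rejecting them. The threshold values and field names are illustrative assumptions.

```python
# Illustrative sketch, not the authors' design: a tolerant tap classifier that
# accepts the long-dwell, large-area contacts that stock gesture recognizers
# often reject. Thresholds are assumptions chosen for demonstration only.
from dataclasses import dataclass

@dataclass
class TouchContact:
    duration_ms: float   # how long the contact stayed down
    area_mm2: float      # approximate contact area reported by the digitizer
    drift_mm: float      # how far the contact point wandered while down

# A typical recognizer might require a short dwell and a small contact area.
# This relaxed version ignores dwell time and size, and only rejects contacts
# that wander far enough to look like a deliberate drag.
MAX_TAP_DRIFT_MM = 15.0

def is_intentional_tap(contact: TouchContact) -> bool:
    """Classify a contact as a tap regardless of dwell time or contact size."""
    return contact.drift_mm <= MAX_TAP_DRIFT_MM

# Example: a two-second press with a fist-sized contact still counts as a tap.
print(is_intentional_tap(TouchContact(duration_ms=2000, area_mm2=900, drift_mm=4)))
```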
Second, the paper characterizes indirect interaction using tools such as headsticks, mouthsticks, styluses, and arm and leg slings, as well as adjustments to user posture. Although these tools serve as effective intermediaries, people with severe physical disabilities still face challenges in managing them. A survey the authors conducted among the users who uploaded the videos shows either strongly positive or strongly negative sentiment toward touchscreens. The positive sentiment is driven largely by the affordability of touchscreen devices such as the iPad and by the ability of some children with speech impairments to communicate using apps. Negative sentiment stems from the difficulty of accessing and using some devices and apps. The survey also reveals problems with non-screen hardware such as physical buttons, which are either too hard to press or too sensitive to inadvertent contact.
Finally, the authors present an extensive discussion of the improvements needed to make touchscreen devices usable by people with mild to severe physical disabilities. Suggestions include developing generic interaction models for applications and tools targeted at people with various disabilities.
The authors also demonstrate the value of combining a dataset of YouTube videos with surveys of the uploaders. They contend that this approach is effective for analyzing and characterizing human-computer interaction, particularly touchscreen accessibility for people with physical disabilities. The paper is well written, with extensive analysis. The authors have made an important and valuable contribution to the field of human-computer interaction.
Online Computing Reviews Service
Cited By
Niu, S., Liu, L., Bian, Y. (2024). Please Understand My Disability: An Analysis of YouTubers' Discourse on Disability Challenges. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW2), 1--25. https://doi.org/10.1145/3686946
Yu, X., Hoggenmüller, M., Tran, T., Wang, Y., Tomitsch, M. (2024). Understanding the Interaction between Delivery Robots and Other Road and Sidewalk Users: A Study of User-generated Online Videos. ACM Transactions on Human-Robot Interaction, 13(4), 1--32. https://doi.org/10.1145/3677615
Lee, L., Lin, Z. (2024). Danger, Nuisance, Disregard: Analyzing User-Generated Videos for Augmented Reality Gameplay on Hand-held Devices. Proceedings of the ACM on Human-Computer Interaction, 8(CHI PLAY), 1--33. https://doi.org/10.1145/3677063
Baltaxe-Admony, L., Duval, J., Ringland, K. (2024). DREEM: Moving from Empathy to Enculturation in Disability Related Human-Centered Design. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1--17. https://doi.org/10.1145/3663548.3675642
Shahi, S., Mollyn, V., Tymoszek Park, C., Kang, R., Liberman, A., Levy, O., Gong, J., Bedri, A., Laput, G. (2024). Vision-Based Hand Gesture Customization from a Single Demonstration. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1--14. https://doi.org/10.1145/3654777.3676378
Motahar, T., Brown, N., Wiese, E., Wiese, J. (2024). Toward Building Design Empathy for People with Disabilities Using Social Media Data: A New Approach for Novice Designers. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 3145--3160. https://doi.org/10.1145/3643834.3660687
Cao, J., Peng, X., Liang, F., Tong, X. (2024). "Voices Help Correlate Signs and Words": Analyzing Deaf and Hard-of-Hearing (DHH) TikTokers' Content, Practices, and Pitfalls. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1--18. https://doi.org/10.1145/3613904.3642413
Sun, Z., Zhou, R., Wang, K., Li, Z. (2024). Enhancing Mobile Interaction for Individuals With Tremors via Optical See-Through Augmented Reality. IEEE Access, 12, 123946--123955. https://doi.org/10.1109/ACCESS.2024.3449880