Research Article | Open Access
DOI: 10.1145/3489849.3489873

Pressing a Button You Cannot See: Evaluating Visual Designs to Assist Persons with Low Vision through Augmented Reality

Published: 08 December 2021

Abstract

Partial vision loss occurs in several medical conditions and affects persons of all ages. It compromises many daily activities, such as reading, cutting vegetables, or identifying and accurately pressing buttons, e.g., on ticket machines or ATMs. Touchscreen interfaces pose a particular challenge because they lack haptic feedback from interface elements and often require people with impaired vision to rely on others for help. We propose a smartglasses-based solution to utilize the user’s residual vision. Together with visually impaired individuals, we designed assistive augmentations for touchscreen interfaces and evaluated their suitability to guide attention towards interface elements and to increase the accuracy of manual inputs. We show that augmentations improve interaction performance and decrease cognitive load, particularly for unfamiliar interface layouts.
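
The paper itself describes the augmentation designs; no code accompanies this page. As an illustration of the general idea only, the minimal sketch below locates a known button graphic in a live camera image by template matching (OpenCV) and overlays a high-contrast outline on it to draw attention toward the target. The template file name button_template.png, the 0.8 match threshold, and the use of a desktop webcam in place of smartglasses are assumptions made for the sketch, not details taken from the paper.

```python
# Illustrative sketch only (not the authors' implementation): find a known
# button graphic in the camera frame via template matching and draw a
# high-contrast outline over it, approximating an attention-guiding
# augmentation for residual vision.
import cv2

TEMPLATE_PATH = "button_template.png"  # hypothetical reference image of the target button
MATCH_THRESHOLD = 0.8                  # assumed confidence cutoff for a valid detection

template = cv2.imread(TEMPLATE_PATH, cv2.IMREAD_GRAYSCALE)
if template is None:
    raise FileNotFoundError(f"Missing template image: {TEMPLATE_PATH}")
t_h, t_w = template.shape

cap = cv2.VideoCapture(0)  # stand-in for a head-worn camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Score every position of the template against the current frame.
    scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    if best_score >= MATCH_THRESHOLD:
        x, y = best_loc
        # High-contrast outline as a simple visual cue toward the button.
        cv2.rectangle(frame, (x, y), (x + t_w, y + t_h), (0, 255, 255), 4)
    cv2.imshow("augmented view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```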

Supplementary Material

  • VTT file: PressingAButtonYouCannotSee.vtt
  • Supplemental document: PressingAButtonYouCannotSee_Supplemental.pdf
  • Video figure (MP4): PressingAButtonYouCannotSee.mp4
  • Video teaser (MP4): PressingAButtonYouCannotSee_Teaser.mp4

Information

Published In

VRST '21: Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology
December 2021
563 pages
ISBN:9781450390927
DOI:10.1145/3489849
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 December 2021

Author Tags

  1. Accessibility
  2. Augmented Reality
  3. Low Vision

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

VRST '21

Acceptance Rates

Overall acceptance rate: 66 of 254 submissions (26%)

Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 330
  • Downloads (last 6 weeks): 35
Reflects downloads up to 23 Nov 2024

Cited By

  • (2024) Survey of visualization methods for multiscene visual cue information in immersive environments. Journal of Image and Graphics 29(1), 1–21. https://doi.org/10.11834/jig.221147. Online publication date: 2024.
  • (2024) CookAR: Affordance Augmentations in Wearable AR to Support Kitchen Tool Interactions for People with Low Vision. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–16. https://doi.org/10.1145/3654777.3676449. Online publication date: 13-Oct-2024.
  • (2024) GazePrompt: Enhancing Low Vision People's Reading Experience with Gaze-Aware Augmentations. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3613904.3642878. Online publication date: 11-May-2024.
  • (2023) Towards Accessible Augmented Reality Learning Authoring Tool: A Case of MirageXR. 2023 IST-Africa Conference (IST-Africa), 1–13. https://doi.org/10.23919/IST-Africa60249.2023.10187746. Online publication date: 31-May-2023.
  • (2023) Inclusive Augmented and Virtual Reality: A Research Agenda. International Journal of Human–Computer Interaction 40(20), 6200–6219. https://doi.org/10.1080/10447318.2023.2247614. Online publication date: 27-Aug-2023.
  • (2023) Inclusive AR/VR: accessibility barriers for immersive technologies. Universal Access in the Information Society 23(1), 59–73. https://doi.org/10.1007/s10209-023-00969-0. Online publication date: 2-Feb-2023.
  • (2023) Inclusive Immersion: a review of efforts to improve accessibility in virtual reality, augmented reality and the metaverse. Virtual Reality 27(4), 2989–3020. https://doi.org/10.1007/s10055-023-00850-8. Online publication date: 12-Sep-2023.
  • (2022) Application of Spatial Cues and Optical Distortions as Augmentations during Virtual Reality (VR) Gaming: The Multifaceted Effects of Assistance for Eccentric Viewing Training. International Journal of Environmental Research and Public Health 19(15), 9571. https://doi.org/10.3390/ijerph19159571. Online publication date: 4-Aug-2022.
