Augmented Reality Glasses as an Orientation and Mobility Aid for People with Low Vision: a Feasibility Study of Experiences and Requirements

Research article
DOI: 10.1145/3411764.3445327
Published: 07 May 2021

Abstract

People with low vision experience reduced mobility that affects their physical and mental wellbeing. Augmented reality (AR) glasses offer new opportunities to provide visual and auditory information that can improve mobility for this vulnerable group. Current research into AR-based mobility aids has focused mainly on technical aspects, with less emphasis on understanding the usability and suitability of these aids for people with various levels of visual impairment. In this paper, we present the results of qualitative interviews with 18 participants who used a HoloLens v1 and eight prototype augmentations, conducted to understand how these enhancements are perceived by people with low vision and how such aids should be adjusted to suit their needs. Our results suggest that participants with moderate vision loss may benefit most from AR glasses, and they underline the importance of extensive customizability to accommodate the needs of a highly varied low vision population.

Supplementary Material

MP4 File (3411764.3445327_videofigure.mp4)
Supplemental video

        Published In

        CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
        May 2021, 10862 pages
        ISBN: 9781450380966
        DOI: 10.1145/3411764

        Publisher

        Association for Computing Machinery, New York, NY, United States

        Author Tags

        1. augmented reality
        2. low vision
        3. mobility aids
        4. vision enhancement

        Cited By

        • (2024) Functional performance comparison of long cane and secondary electronic travel aids for mobility enhancement. British Journal of Visual Impairment. DOI: 10.1177/02646196241285098. Online publication date: 8-Oct-2024.
        • (2024) Dude, Where's My Luggage? An Autoethnographic Account of Airport Navigation by a Traveler with Residual Vision. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-13. DOI: 10.1145/3663548.3675624. Online publication date: 27-Oct-2024.
        • (2024) Low Vision Boxing: Participatory Design of Adaptive Kickboxing Experiences with Low Vision Person. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-18. DOI: 10.1145/3663548.3675619. Online publication date: 27-Oct-2024.
        • (2024) WatchCap: Improving Scanning Efficiency in People with Low Vision through Compensatory Head Movement Stimulation. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8, 2, 1-32. DOI: 10.1145/3659592. Online publication date: 15-May-2024.
        • (2024) Seeing Art Differently: Design Considerations to Improve Visual Art Engagement for People with Low Vision. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 2820-2832. DOI: 10.1145/3643834.3660675. Online publication date: 1-Jul-2024.
        • (2024) A Design Space for Vision Augmentations and Augmented Human Perception using Digital Eyewear. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642380. Online publication date: 11-May-2024.
        • (2024) An Overview of Mobility Awareness with Mobile Edge Computing over 6G Network: Challenges and Future Research Directions. Results in Engineering, 102601. DOI: 10.1016/j.rineng.2024.102601. Online publication date: Jul-2024.
        • (2023) Enhancing Wayfinding Experience in Low-Vision Individuals through a Tailored Mobile Guidance Interface. Electronics 12, 22, 4561. DOI: 10.3390/electronics12224561. Online publication date: 7-Nov-2023.
        • (2023) Using augmented reality to cue obstacles for people with low vision. Optics Express 31, 4, 6827. DOI: 10.1364/OE.479258. Online publication date: 9-Feb-2023.
        • (2023) A systematic review of extended reality (XR) for understanding and augmenting vision loss. Journal of Vision 23, 5, 5. DOI: 10.1167/jov.23.5.5. Online publication date: 4-May-2023.
