
Corridor-Walker: Mobile Indoor Walking Assistance for Blind People to Avoid Obstacles and Recognize Intersections

Published: 20 September 2022

Abstract

Navigating an indoor corridor can be challenging for blind people, who must be aware of obstacles while also recognizing the intersections that lead to their destination. To aid blind people in such tasks, we propose Corridor-Walker, a smartphone-based system that helps blind people avoid obstacles and recognize intersections. The system uses the LiDAR sensor on a smartphone to construct a 2D occupancy grid map of the surrounding environment. It then generates an obstacle-avoiding path and detects upcoming intersections on the grid map. Finally, the system guides the user along the generated path and notifies the user of the existence and shape of each intersection using vibration and audio feedback. A user study with 14 blind participants revealed that Corridor-Walker allowed participants to avoid obstacles, rely less on the wall to walk straight, and recognize intersections.
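The pipeline the abstract describes, quantizing depth points into a 2D occupancy grid and then planning an obstacle-avoiding path over it, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the grid size, cell resolution, function names, and the choice of 4-connected A* with a Manhattan heuristic are all assumptions.

```python
# Illustrative sketch (not the paper's code): mark LiDAR points on a
# 2D occupancy grid, then find an obstacle-avoiding path with A* search.
import heapq

def build_grid(points, size=20, cell=0.25):
    """Quantize (x, z) LiDAR points (in metres) into a size x size grid."""
    grid = [[0] * size for _ in range(size)]
    for x, z in points:
        i, j = int(x / cell), int(z / cell)
        if 0 <= i < size and 0 <= j < size:
            grid[i][j] = 1  # mark the cell as occupied by an obstacle
    return grid

def astar(grid, start, goal):
    """4-connected A* with a Manhattan heuristic; returns a list of cells."""
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    # Each heap entry: (f = g + h, g, current cell, path so far).
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        _, g, cur, path = heapq.heappop(open_set)
        if cur == goal:
            return path
        if cur in seen:
            continue
        seen.add(cur)
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = cur[0] + di, cur[1] + dj
            if (0 <= ni < len(grid) and 0 <= nj < len(grid[0])
                    and grid[ni][nj] == 0 and (ni, nj) not in seen):
                nxt = (ni, nj)
                heapq.heappush(open_set,
                               (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None  # no obstacle-free path exists
```

In the real system the returned cell path would be converted back to metric waypoints and used to drive the vibration and audio guidance; intersection detection would additionally look for free-space branches (left, right, straight) in the same grid.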

Supplementary Material

MP4 File (v6mhci179.mp4)
Supplemental video




Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 6, Issue MHCI
September 2022, 852 pages
EISSN: 2573-0142
DOI: 10.1145/3564624
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. intersection detection
  2. obstacle avoidance
  3. orientation and mobility
  4. visual impairment

Qualifiers

  • Research-article

Article Metrics

  • Downloads (last 12 months): 474
  • Downloads (last 6 weeks): 46

Reflects downloads up to 13 Feb 2025.


Cited By

  • (2024) Real-Time View Assistance for the Blind using Image Processing. Journal of Innovative Image Processing 6(2), 96–109. https://doi.org/10.36548/jiip.2024.2.002
  • (2024) Arduino-Based Sensor System for Safe Mobility of People with Visual Impairments. 2024 35th Conference of Open Innovations Association (FRUCT), 267–274. https://doi.org/10.23919/FRUCT61870.2024.10516356
  • (2024) Review—Innovations in Flexible Sensory Devices for the Visually Impaired. ECS Journal of Solid State Science and Technology 13(7), 077011. https://doi.org/10.1149/2162-8777/ad6588
  • (2024) AngleSizer: Enhancing Spatial Scale Perception for the Visually Impaired with an Interactive Smartphone Assistant. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1–31. https://doi.org/10.1145/3678525
  • (2024) "We are at the mercy of others' opinion": Supporting Blind People in Recreational Window Shopping with AI-infused Technology. Proceedings of the 21st International Web for All Conference, 45–58. https://doi.org/10.1145/3677846.3677860
  • (2024) Snap&Nav: Smartphone-based Indoor Navigation System For Blind People via Floor Map Analysis and Intersection Detection. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1–22. https://doi.org/10.1145/3676522
  • (2024) WorldScribe: Towards Context-Aware Live Visual Descriptions. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–18. https://doi.org/10.1145/3654777.3676375
  • (2024) A Review of Intelligent Walking Support Robots: Aiding Sit-to-Stand Transition and Walking. IEEE Transactions on Neural Systems and Rehabilitation Engineering 32, 1355–1369. https://doi.org/10.1109/TNSRE.2024.3379453
  • (2024) MIM: Indoor and Outdoor Navigation in Complex Environments Using Multi-Layer Intensity Maps. 2024 IEEE International Conference on Robotics and Automation (ICRA), 10917–10924. https://doi.org/10.1109/ICRA57147.2024.10610673
  • (2024) Using Portable Virtual Reality to Assess Mobility of Blind and Low-Vision Individuals With the Audomni Sensory Supplementation Feedback. IEEE Access 12, 26222–26241. https://doi.org/10.1109/ACCESS.2024.3366808
