

Virtual navigation for blind people: Transferring route knowledge to the real world

Published: 01 March 2020

Highlights

Blind people can use virtual navigation to quickly learn short real-world routes.
Most users gained comprehensive knowledge of all routes within three sessions.
Virtual navigation allowed users to complete 60-meter real-world routes unassisted.
When receiving in-situ navigation assistance, users rely less on prior knowledge.
Some users took advantage of the knowledge acquired virtually to recover from errors.


Abstract

Independent navigation is challenging for blind people, particularly in unfamiliar environments. Navigation assistive technologies try to provide additional support by guiding users or increasing their knowledge of the surroundings, but accurate solutions are still not widely available. Motivated by this limitation, and by the fact that spatial knowledge can also be acquired indirectly (prior to navigation), we developed an interactive virtual navigation app with which users can learn unfamiliar routes before physically visiting the environment. Our main research goals are to understand the acquisition of route knowledge through smartphone-based virtual navigation and how it evolves over time; its ability to support independent, unassisted real-world navigation of short routes; and its ability to improve user performance when using an accurate in-situ navigation tool (NavCog). With these goals in mind, we conducted a user study in which 14 blind participants virtually learned routes at home for three consecutive days and then physically navigated them, both unassisted and with NavCog. In virtual navigation, we analyzed the evolution of route knowledge and found that participants were able to quickly learn shorter routes and to gradually increase their knowledge of both short and long routes. In the real world, we found that users were able to take advantage of this knowledge, acquired entirely through virtual navigation, to complete unassisted navigation tasks. When using NavCog, users tended to rely on the navigation system rather than on their prior knowledge, and therefore virtual navigation did not significantly improve their performance.




        Published In

        International Journal of Human-Computer Studies  Volume 135, Issue C
        Mar 2020
        119 pages

        Publisher

        Academic Press, Inc.

        United States


        Author Tags

        1. Virtual environment
        2. Indoor navigation
        3. Route knowledge
        4. Accessibility
        5. Assistive technologies
        6. Orientation and mobility
        7. Travel aids

        Qualifiers

        • Research-article


Cited By
        • (2024) EasyGO: A Field Study of Grocery Store Navigation Application Design for the Visually Impaired. Companion Publication of the 2024 ACM Designing Interactive Systems Conference, 10.1145/3656156.3663719, pp. 214–218. Online publication date: 1-Jul-2024.
        • (2024) Investigating Virtual Reality Locomotion Techniques with Blind People. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 10.1145/3613904.3642088, pp. 1–17. Online publication date: 11-May-2024.
        • (2024) Exploring Audio Interfaces for Vertical Guidance in Augmented Reality via Hand-Based Feedback. IEEE Transactions on Visualization and Computer Graphics 30(5), 10.1109/TVCG.2024.3372040, pp. 2818–2828. Online publication date: 4-Mar-2024.
        • (2024) Empowering Orientation and Mobility Instructors: Digital Tools for Enhancing Navigation Skills in People Who Are Blind. Computers Helping People with Special Needs, 10.1007/978-3-031-62846-7_52, pp. 436–443. Online publication date: 8-Jul-2024.
        • (2024) On the Use of a Simulation Framework for Studying Accessibility Challenges Faced by People with Disabilities in Indoor Environments. Computers Helping People with Special Needs, 10.1007/978-3-031-62846-7_4, pp. 31–37. Online publication date: 8-Jul-2024.
        • (2023) Spatial Multimodal Alert Cues for Virtual Reality Control Environments in Industry 4.0. Proceedings of the 2nd International Conference of the ACM Greek SIGCHI Chapter, 10.1145/3609987.3610003, pp. 1–6. Online publication date: 27-Sep-2023.
        • (2023) Opportunities for Accessible Virtual Reality Design for Immersive Musical Performances for Blind and Low-Vision People. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 10.1145/3607822.3614540, pp. 1–21. Online publication date: 13-Oct-2023.
        • (2023) "I Want to Figure Things Out": Supporting Exploration in Navigation for People with Visual Impairments. Proceedings of the ACM on Human-Computer Interaction 7(CSCW1), 10.1145/3579496, pp. 1–28. Online publication date: 16-Apr-2023.
        • (2023) Inclusive Social Virtual Environments: Exploring the Acceptability of Different Navigation and Awareness Techniques. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 10.1145/3544549.3585700, pp. 1–7. Online publication date: 19-Apr-2023.
        • (2023) 3D Building Plans: Supporting Navigation by People who are Blind or have Low Vision in Multi-Storey Buildings. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 10.1145/3544548.3581389, pp. 1–19. Online publication date: 19-Apr-2023.
