Abstract
In this paper, we present the design of a wearable device that helps visually impaired people perceive their environment during indoor and outdoor mobility and navigation. Our prototype can detect and identify traffic situations such as street crossings, traffic lights, cars, cyclists, other people, and obstacles hanging down from above or placed on the ground. Detection takes place in real time based on input data from sensors and cameras, and the user's mobility is aided with audio signals.
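The abstract describes a pipeline in which camera and sensor data are turned into real-time detections that are then conveyed to the user as audio cues. Below is a minimal, hypothetical sketch of that final step only, assuming detections carry a label, an estimated distance, and a bearing; the class names and the announce/prioritize helpers are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: turning detections from a camera/sensor pipeline into
# short spoken cues, nearest hazards first. Not the authors' implementation.

from dataclasses import dataclass


@dataclass
class Detection:
    label: str          # e.g. "traffic light", "car", "overhanging obstacle"
    distance_m: float   # estimated distance from a depth/stereo sensor
    bearing_deg: float  # angle relative to the user's heading (negative = left)


def announce(det: Detection) -> str:
    """Render one detection as a short cue (here returned as text)."""
    side = "left" if det.bearing_deg < 0 else "right"
    return f"{det.label}, {det.distance_m:.0f} meters, {side}"


def prioritize(detections: list[Detection]) -> list[Detection]:
    """Order detections so the nearest hazards are announced first."""
    return sorted(detections, key=lambda d: d.distance_m)


if __name__ == "__main__":
    frame = [
        Detection("car", 6.0, 20.0),
        Detection("street crossing", 3.0, -5.0),
        Detection("overhanging obstacle", 1.5, 2.0),
    ]
    for det in prioritize(frame):
        print(announce(det))
```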
Acknowledgments
The development of this prototype was supported by Magyar Telekom and Gaia Software Ltd. Publication of this work was funded by the European Union and co-financed by the European Social Fund under Grant No. TÁMOP-4.2.2.D-15/1/KONV-2015-0006 (MedicNetwork).
Additional information
This article is an extended version of the paper that appeared at CogInfocom 2014.
Cite this article
Sövény, B., Kovács, G. & Kardkovács, Z.T. Blind guide. J Multimodal User Interfaces 9, 287–297 (2015). https://doi.org/10.1007/s12193-015-0191-6