Virtual Paving: Rendering a Smooth Path for People with Visual Impairment through Vibrotactile and Audio Feedback

Published: 04 September 2020

Abstract

Tactile pavings are public works for visually impaired people, designed to indicate a particular path to follow by providing haptic cues underfoot. However, they face many limitations, such as installation errors, obstructions, degradation, and limited coverage. To address these issues, we propose Virtual Paving, which aims to assist independent navigation by rendering a smooth path to visually impaired people through multi-modal feedback. This work assumes that a path has been planned to avoid obstacles and focuses on the feedback design to guide users along the path safely, smoothly, and efficiently. First, we extracted design guidelines for Virtual Paving based on an investigation into visually impaired people's current practices and issues with tactile pavings. Next, we developed a multi-modal solution through co-design and evaluation with visually impaired users. This solution included (1) vibrotactile feedback on the shoulders and waist to give readily perceivable directional cues and (2) audio feedback to describe road conditions ahead of the user. Finally, we evaluated the proposed solution through user tests. Guided by the designed feedback, 16 visually impaired participants successfully completed 127 out of 128 trials on 2.1 m-wide basic paths, including straight and curved paths. Subjective feedback indicated that our solution for rendering Virtual Paving was easy for users to learn and enabled them to walk smoothly. The feasibility and potential limitations of Virtual Paving for supporting independent navigation in real environments are discussed.
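To make the abstract's feedback design concrete, the mapping from a walker's deviation off the rendered path to a directional vibrotactile cue can be sketched as below. This is an illustrative sketch only, not the authors' implementation: the turn-toward convention, the linear intensity ramp, and the function name are assumptions; only the 2.1 m path width comes from the abstract.

```python
# Illustrative sketch only -- not the paper's implementation. It mimics the
# idea of corrective vibrotactile cues for staying on a rendered path; the
# turn-toward convention and linear intensity ramp are assumptions.

def directional_cue(lateral_offset_m: float, path_width_m: float = 2.1):
    """Map the user's signed offset from the path centerline to a cue.

    lateral_offset_m: negative = drifted left of center, positive = right.
    path_width_m:     walkable corridor width (2.1 m basic paths in the study).

    Returns (side, intensity): which side to vibrate (the side to turn
    toward) and how strongly, ramping from 0 at the centerline to 1 at
    the path edge and clamped beyond it.
    """
    half_width = path_width_m / 2.0
    intensity = min(abs(lateral_offset_m) / half_width, 1.0)
    if intensity == 0.0:
        return ("none", 0.0)  # on the centerline: no corrective cue
    side = "right" if lateral_offset_m < 0 else "left"
    return (side, intensity)
```

Under these assumptions, a user who has drifted 0.525 m left of center on a 2.1 m path would feel a half-strength pulse on the right side, nudging them back toward the centerline.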

Supplementary Material

xu.zip
Supplemental movie, appendix, image, and software files for "Virtual Paving: Rendering a Smooth Path for People with Visual Impairment through Vibrotactile and Audio Feedback"





        Published In

        Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 4, Issue 3
        September 2020
        1061 pages
        EISSN: 2474-9567
        DOI: 10.1145/3422862
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        Published: 04 September 2020
        Published in IMWUT Volume 4, Issue 3


        Author Tags

        1. accessibility
        2. auditory feedback
        3. path following
        4. tactile paving
        5. vibrotactile feedback
        6. visual impairment

        Qualifiers

        • Research-article
        • Research
        • Refereed

        Article Metrics

        • Downloads (last 12 months): 141
        • Downloads (last 6 weeks): 14
        Reflects downloads up to 01 Oct 2024

        Cited By

        • (2024) WatchCap: Improving Scanning Efficiency in People with Low Vision through Compensatory Head Movement Stimulation. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(2), 1-32. DOI: 10.1145/3659592
        • (2023) Enhancing Wayfinding Experience in Low-Vision Individuals through a Tailored Mobile Guidance Interface. Electronics 12(22), 4561. DOI: 10.3390/electronics12224561
        • (2023) MobiEye: An Efficient Shopping-Assistance System for the Visually Impaired With Mobile Phone Sensing. IEEE Transactions on Human-Machine Systems 53(5), 865-874. DOI: 10.1109/THMS.2023.3305566
        • (2022) Human-Machine Cooperative Echolocation Using Ultrasound. IEEE Access 10, 125264-125278. DOI: 10.1109/ACCESS.2022.3224468
        • (2021) The practice of applying AI to benefit visually impaired people in China. Communications of the ACM 64(11), 70-75. DOI: 10.1145/3481623
        • (2021) LightGuide. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(2), 1-27. DOI: 10.1145/3463524
        • (2021) Tactile Compass: Enabling Visually Impaired People to Follow a Path with Continuous Directional Feedback. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3411764.3445644
        • (2021) Auth+Track: Enabling Authentication Free Interaction on Smartphone by Continuous User Tracking. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3411764.3445624
        • (2020) HealthWalks. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 4(4), 1-26. DOI: 10.1145/3432229
        • (2020) HeadCross. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 4(1), 1-22. DOI: 10.1145/3380983
