Abstract
Access to non-verbal cues in social interactions is vital for people with visual impairment. It has been shown that non-verbal cues such as eye contact, the number of people present, their names, and their positions are helpful for individuals who are blind. In this paper, we present a real-time multimodal system that provides such non-verbal cues via audio and haptic interfaces. We assess the usefulness of the proposed system both quantitatively and qualitatively by gathering feedback from a focus group of visually impaired participants in a typical social interaction setting. The paper provides important insight into developing such technology for this significant part of society.
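To make the kind of cue delivery described above concrete, the sketch below shows how a single camera frame could be turned into a coarse verbal description of how many people are visible and roughly where they are. It is a minimal illustration only, assuming OpenCV's bundled Haar face detector and illustrative position thresholds; it is not the pipeline or the interface implemented in the paper.

```python
# Minimal sketch (not the authors' implementation): count faces in one camera
# frame and describe their rough horizontal positions as a spoken-style message.
# Assumes OpenCV's bundled Haar cascade; thresholds are illustrative choices.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # e.g. a head-mounted or laptop camera
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    width = frame.shape[1]
    descriptions = []
    for (x, y, w, h) in faces:
        centre = x + w / 2
        # Map the horizontal face position to a coarse verbal direction.
        if centre < width / 3:
            descriptions.append("on your left")
        elif centre > 2 * width / 3:
            descriptions.append("on your right")
        else:
            descriptions.append("in front of you")

    message = f"{len(faces)} person(s) detected: " + ", ".join(descriptions)
    print(message)  # a real system would route this to audio/haptic output
```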
Cite this article
Sarfraz, M., Constantinescu, A., Zuzej, M. et al. A Multimodal Assistive System for Helping Visually Impaired in Social Interactions. Informatik Spektrum 40, 540–545 (2017). https://doi.org/10.1007/s00287-017-1077-7