DOI: 10.1145/2384916.2384935

Learning non-visual graphical information using a touch-based vibro-audio interface

Published: 22 October 2012

Abstract

This paper evaluates an inexpensive and intuitive approach for providing non-visual access to graphical material, called a vibro-audio interface. The system lets users freely explore graphical information on the touchscreen of a commercially available tablet while synchronously triggering vibration patterns and auditory information whenever an on-screen visual element is touched. Three studies assessed legibility and comprehension of the relative relations and global structure of a bar graph (Exp 1), pattern recognition via a letter-identification task (Exp 2), and orientation discrimination of geometric shapes (Exp 3). Performance with the touch-based device was compared with the same tasks performed using standard hardcopy tactile graphics. Results showed similar error performance between modes on all measures, indicating that the vibro-audio interface is a viable multimodal solution for providing access to dynamic visual information and for supporting accurate spatial learning and the development of mental representations of graphical material.
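
The interaction described in the abstract maps closely onto stock Android touchscreen APIs: hit-test the finger position against the on-screen elements, then trigger vibration and speech when the finger enters an element. The Kotlin sketch below is a hypothetical, minimal illustration of that loop, not the authors' implementation; the class name, the hard-coded bar regions and labels, and the single 50 ms vibration are invented for the example, and the android.permission.VIBRATE permission is assumed in the manifest.

import android.content.Context
import android.graphics.Canvas
import android.graphics.Paint
import android.graphics.RectF
import android.os.Vibrator
import android.speech.tts.TextToSpeech
import android.view.MotionEvent
import android.view.View

// Hypothetical sketch: a custom View that gives vibro-audio feedback while the
// finger explores hard-coded "bars" of a graph. Requires android.permission.VIBRATE.
class VibroAudioGraphView(context: Context) : View(context) {

    // Invented bar-graph elements: an on-screen region paired with a spoken label.
    private val bars = listOf(
        RectF(100f, 600f, 200f, 1000f) to "January, value 40",
        RectF(300f, 400f, 400f, 1000f) to "February, value 60",
        RectF(500f, 700f, 600f, 1000f) to "March, value 30"
    )

    private val barPaint = Paint().apply { color = 0xFF3366AA.toInt() }
    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    private val tts = TextToSpeech(context) { /* init status ignored in this sketch */ }
    private var currentLabel: String? = null  // label of the element currently under the finger

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        // Render the bars so sighted users see the same graphic being explored non-visually.
        for ((rect, _) in bars) canvas.drawRect(rect, barPaint)
    }

    override fun onTouchEvent(event: MotionEvent): Boolean {
        // Hit-test the touch point against every graphical element.
        val hit = bars.firstOrNull { (rect, _) -> rect.contains(event.x, event.y) }
        if (hit == null) {
            currentLabel = null                    // finger on empty background: no feedback
        } else if (hit.second != currentLabel) {   // finger just entered a new element
            currentLabel = hit.second
            vibrator.vibrate(50L)                  // brief buzz marks contact (pre-API-26 call; newer APIs use VibrationEffect)
            tts.speak(hit.second, TextToSpeech.QUEUE_FLUSH, null, null)  // announce the element's label
        }
        // If the finger stays inside the same element, feedback is not re-triggered.
        return true
    }
}

In the study's interface the feedback is richer (distinct vibration patterns plus auditory information per element), but the explore, hit-test, feedback cycle shown here is the core interaction the abstract describes.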


Information

Published In

ASSETS '12: Proceedings of the 14th international ACM SIGACCESS conference on Computers and accessibility
October 2012
321 pages
ISBN: 9781450313216
DOI: 10.1145/2384916
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. accessibility (blind and visually-impaired)
  2. android programming
  3. assistive technology
  4. audio cues
  5. graphs and diagrams
  6. haptic cues
  7. information graphics

Qualifiers

  • Research-article

Conference

ASSETS '12

Acceptance Rates

Overall acceptance rate: 436 of 1,556 submissions (28%)


Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 58
  • Downloads (last 6 weeks): 2
Reflects downloads up to 02 Oct 2024

Cited By
  • (2024) The FlexiBoard: Tangible and Tactile Graphics for People with Vision Impairments. Multimodal Technologies and Interaction, 8(3), 17. DOI: 10.3390/mti8030017. Online publication date: 27-Feb-2024
  • (2024) Help-Seeking Situations Related to Visual Interactions on Mobile Platforms and Recommended Designs for Blind and Visually Impaired Users. Journal of Imaging, 10(8), 205. DOI: 10.3390/jimaging10080205. Online publication date: 22-Aug-2024
  • (2024) Bridging the Gap of Graphical Information Accessibility in Education With Multimodal Touchscreens Among Students With Blindness and Low Vision. Journal of Visual Impairment & Blindness, 117(6), 453-466. DOI: 10.1177/0145482X231217496. Online publication date: 8-Jan-2024
  • (2024) AI-Vision: A Three-Layer Accessible Image Exploration System for People with Visual Impairments in China. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(3), 1-27. DOI: 10.1145/3678537. Online publication date: 9-Sep-2024
  • (2024) Accessible Maps for the Future of Inclusive Ridesharing. Proceedings of the 16th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 106-115. DOI: 10.1145/3640792.3675736. Online publication date: 22-Sep-2024
  • (2024) Designing Unobtrusive Modulated Electrotactile Feedback on Fingertip Edge to Assist Blind and Low Vision (BLV) People in Comprehending Charts. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3613904.3642546. Online publication date: 11-May-2024
  • (2024) TADA: Making Node-link Diagrams Accessible to Blind and Low-Vision People. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3613904.3642222. Online publication date: 11-May-2024
  • (2023) ShapeSonic: Sonifying Fingertip Interactions for Non-Visual Virtual Shape Perception. SIGGRAPH Asia 2023 Conference Papers, 1-9. DOI: 10.1145/3610548.3618246. Online publication date: 10-Dec-2023
  • (2023) Comparing Natural Language and Vibro-Audio Modalities for Inclusive STEM Learning with Blind and Low Vision Users. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-17. DOI: 10.1145/3597638.3608429. Online publication date: 22-Oct-2023
  • (2023) TouchPilot: Designing a Guidance System that Assists Blind People in Learning Complex 3D Structures. Proceedings of the 25th International ACM SIGACCESS Conference on Computers and Accessibility, 1-18. DOI: 10.1145/3597638.3608426. Online publication date: 22-Oct-2023
