
Principles for Designing Large-Format Refreshable Haptic Graphics Using Touchscreen Devices: An Evaluation of Nonvisual Panning Methods

Published: 03 February 2017

Abstract

Touchscreen devices, such as smartphones and tablets, represent a modern solution for providing graphical access to people with blindness and visual impairment (BVI). A significant problem with these solutions, however, is their limited screen real estate, which necessitates panning or zooming operations for accessing large-format graphical materials such as maps. Non-visual interfaces cannot directly employ traditional panning or zooming techniques because of perceptual and cognitive limitations, such as the constrained haptic field of view and the disorientation that results from losing one's reference point after performing these operations. This article describes the development of four novel non-visual panning methods designed from the outset around these perceptual and cognitive constraints. Two studies evaluated the usability of these panning methods against a non-panning control condition. Results demonstrated that exploration, learning, and subsequent spatial behaviors were similar between panning and non-panning conditions, with one panning mode, based on a two-finger drag technique, yielding the best overall performance. These findings provide compelling evidence that incorporating panning operations on touchscreen devices (the fastest growing computational platform among the BVI demographic) is a viable, low-cost, and immediate solution for giving BVI people access to a broad range of large-format digital graphical information.
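The two-finger drag mode lends itself to a compact implementation on commodity touchscreen hardware. The sketch below (Kotlin, using Android's standard MotionEvent API) shows one plausible way such a mode could be wired up; the class, field names, and clamping policy are illustrative assumptions, not the interface evaluated in the article. The idea is that one finger explores the currently rendered viewport in place, while a second finger engages a pan that shifts the viewport over the larger virtual map.

```kotlin
import android.view.MotionEvent

// Hypothetical sketch of a two-finger-drag panning mode for a large virtual map
// viewed through a small touchscreen viewport. Not the authors' implementation;
// names and the clamping policy are assumptions for illustration.
class ViewportPanner(
    private val mapWidth: Float,    // full map extent, in pixels
    private val mapHeight: Float,
    private val viewWidth: Float,   // on-screen viewport extent, in pixels
    private val viewHeight: Float
) {
    // Top-left corner of the viewport within the virtual map.
    var offsetX = 0f; private set
    var offsetY = 0f; private set

    private var lastX = 0f
    private var lastY = 0f
    private var panning = false

    /** Returns true while a two-finger pan is in progress; one-finger touches
     *  are left for in-place haptic exploration of the current viewport. */
    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_POINTER_DOWN -> if (event.pointerCount == 2) {
                // Second finger lands: switch from exploration to panning.
                lastX = midX(event); lastY = midY(event)
                panning = true
            }
            MotionEvent.ACTION_MOVE -> if (panning && event.pointerCount >= 2) {
                // Shift the viewport opposite to the drag and clamp it to the map
                // bounds so the graphic can never be panned fully off-screen.
                offsetX = (offsetX - (midX(event) - lastX))
                    .coerceIn(0f, maxOf(0f, mapWidth - viewWidth))
                offsetY = (offsetY - (midY(event) - lastY))
                    .coerceIn(0f, maxOf(0f, mapHeight - viewHeight))
                lastX = midX(event); lastY = midY(event)
            }
            MotionEvent.ACTION_POINTER_UP, MotionEvent.ACTION_UP,
            MotionEvent.ACTION_CANCEL -> panning = false
        }
        return panning
    }

    private fun midX(e: MotionEvent) = (e.getX(0) + e.getX(1)) / 2f
    private fun midY(e: MotionEvent) = (e.getY(0) + e.getY(1)) / 2f
}
```

Clamping the offset is one simple way to address the reference-point problem noted above: the viewport always remains on the map, and its edges give the user a stable frame to re-anchor against after a pan.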

        Published In

        ACM Transactions on Accessible Computing, Volume 9, Issue 3
        September 2017
        81 pages
        ISSN: 1936-7228
        EISSN: 1936-7236
        DOI: 10.1145/3040970
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        Published: 03 February 2017
        Accepted: 01 December 2016
        Revised: 01 December 2016
        Received: 01 July 2016
        Published in TACCESS Volume 9, Issue 3


        Author Tags

        1. Accessibility (blind and visually impaired)
        2. assistive technology
        3. auditory cues
        4. haptic cues
        5. non-visual maps
        6. touchscreens
        7. vibro-audio interface

        Qualifiers

        • Research-article
        • Research
        • Refereed
