Designing audio and tactile crossmodal icons for mobile devices

Research article · DOI: 10.1145/1322192.1322222
Published: 12 November 2007

Abstract

This paper reports an experiment into the design of crossmodal icons which can provide an alternative form of output for mobile devices using audio and tactile modalities to communicate information. A complete set of crossmodal icons was created by encoding three dimensions of information in three crossmodal auditory/tactile parameters. Earcons were used for the audio and Tactons for the tactile crossmodal icons. The experiment investigated absolute identification of audio and tactile crossmodal icons when a user is trained in one modality and tested in the other (and given no training in the other modality) to see if knowledge could be transferred between modalities. We also compared performance when users were static and mobile to see any effects that mobility might have on recognition of the cues. The results showed that if participants were trained in sound with Earcons and then tested with the same messages presented via Tactons they could recognize 85% of messages when stationary and 76% when mobile. When trained with Tactons and tested with Earcons participants could accurately recognize 76.5% of messages when stationary and 71% of messages when mobile. These results suggest that participants can recognize and understand a message in a different modality very effectively. These results will aid designers of mobile displays in creating effective crossmodal cues which require minimal training for users and can provide alternative presentation modalities through which information may be presented if the context requires.
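The encoding scheme the abstract describes, with three dimensions of information mapped onto three shared auditory/tactile parameters, can be sketched in code. This is a minimal illustration only: the parameter names (rhythm, texture, spatial location) and the dimension assignments (message type, urgency, sender) are assumptions for the example, not the paper's exact design, and the audio/tactile realisations are placeholders.

```python
# Hypothetical sketch of a crossmodal icon: one abstract specification
# that can be rendered either as an Earcon (audio) or a Tacton (tactile).
# Parameter names and mappings are illustrative assumptions.

from dataclasses import dataclass

# Shared abstract parameter values, each with an audio and a tactile rendering.
RHYTHM = {"short": [0.1, 0.1], "long": [0.4], "mixed": [0.1, 0.4]}   # durations in seconds
AUDIO_TIMBRE = {"smooth": "sine", "rough": "sawtooth"}
TACTILE_TEXTURE = {"smooth": "unmodulated", "rough": "amplitude-modulated"}

@dataclass
class CrossmodalIcon:
    rhythm: str    # dimension 1, e.g. message type
    texture: str   # dimension 2, e.g. urgency
    location: str  # dimension 3, e.g. sender group ("left" / "right")

    def to_earcon(self) -> dict:
        """Render the icon as audio parameters."""
        return {"note_durations_s": RHYTHM[self.rhythm],
                "waveform": AUDIO_TIMBRE[self.texture],
                "pan": self.location}

    def to_tacton(self) -> dict:
        """Render the same icon as vibrotactile parameters."""
        return {"pulse_durations_s": RHYTHM[self.rhythm],
                "modulation": TACTILE_TEXTURE[self.texture],
                "actuator": self.location}

# The same specification yields structurally matched cues in both modalities,
# which is what lets training in one modality transfer to the other.
icon = CrossmodalIcon(rhythm="short", texture="rough", location="left")
print(icon.to_earcon())
print(icon.to_tacton())
```

Because both renderings derive from one specification, a user who has learned that a "short, rough" Earcon means an urgent text message can, in principle, recognise the corresponding Tacton without separate training, which is the transfer effect the experiment measures.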




    Published In

    ICMI '07: Proceedings of the 9th international conference on Multimodal interfaces
    November 2007
    402 pages
    ISBN:9781595938176
    DOI:10.1145/1322192
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. crossmodal interaction
    2. earcons
    3. mobile interaction
    4. multimodal interaction
    5. tactons (tactile icons)


Conference

ICMI '07: International Conference on Multimodal Interfaces
November 12–15, 2007, Nagoya, Aichi, Japan

    Acceptance Rates

    Overall Acceptance Rate 453 of 1,080 submissions, 42%


Cited By

    • (2024) Sound-to-Touch Crossmodal Pitch Matching for Short Sounds. IEEE Transactions on Haptics 17(1), 2–7. DOI: 10.1109/TOH.2023.3338224
    • (2024) It Sounds Cool: Exploring Sonification of Mid-Air Haptic Textures Exploration on Texture Judgments, Body Perception, and Motor Behaviour. IEEE Transactions on Haptics 17(2), 237–248. DOI: 10.1109/TOH.2023.3320492
    • (2022) Birdbox: Exploring the User Experience of Crossmodal, Multisensory Data Representations. Proceedings of the 21st International Conference on Mobile and Ubiquitous Multimedia, 12–21. DOI: 10.1145/3568444.3568455
    • (2022) Characterising Soundscape Research in Human-Computer Interaction. Proceedings of the 2022 ACM Designing Interactive Systems Conference, 1394–1417. DOI: 10.1145/3532106.3533458
    • (2022) First Steps Towards Designing Electrotactons: Investigating Intensity and Pulse Frequency as Parameters for Electrotactile Cues. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1–11. DOI: 10.1145/3491102.3501863
    • (2020) The Effect of Context on Small Screen and Wearable Device Users' Performance - A Systematic Review. ACM Computing Surveys 53(3), 1–44. DOI: 10.1145/3386370
    • (2020) It's All in the Timing. ACM Transactions on Computer-Human Interaction 27(3), 1–29. DOI: 10.1145/3386358
    • (2020) Haptic and Auditive Mesh Inspection for Blind 3D Modelers. Proceedings of the 22nd International ACM SIGACCESS Conference on Computers and Accessibility, 1–10. DOI: 10.1145/3373625.3417007
    • (2020) Haptic and audio interaction design. Journal on Multimodal User Interfaces. DOI: 10.1007/s12193-020-00344-w
    • (2019) Can Changes in Heart Rate Variability Represented in Sound be Identified by Non-Medical Experts? Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, 1–6. DOI: 10.1145/3290607.3308456

    Additional citing publications are omitted.
