1 Introduction
Design and perception of sonification can be considered from the viewpoint of psychoacoustics and auditory scene analysis, but also from the viewpoint of aesthetics and user experience. The theme of the Interactive Sonification Workshop (ISon 2022) was “Psychoacoustics in the Loop” [1], underlining the usefulness of sonification for psychoacoustic experiments and the potential of psychoacoustic methods in the sound design and evaluation of sonification. The theme of the International Conference on Auditory Display (ICAD 2023) was “Sonification for the Masses” [2], pointing towards the necessity of sound design in sonification that is both interpretable and appealing. The themes of these two major events in the field of sonification and auditory display underline the relevance of the design and perception of interactive sonification. Sonification concepts can start with the data to present, with a use case in mind, with ideas about the user experience to create, or with the definition of a system that offers a manageable set of acoustic and interaction parameters. This special issue covers the technical side and the human factors as well as the aesthetic and creative aspects of sonification design and perception.
2 In this issue
Six submissions were selected for publication in this special issue. We believe they advance the field of interactive sonification in general, and the design and perception of interactive sonification in particular.
Jason Sterkenburg, Steven Landry, Shabnam FakhrHosseini and Myounghoon Jeon [3] evaluate their interface from [4], which combines air gestures, tracked with a motion sensor as a human interface device, with an auditory menu as the user interface. Interactive experiments in a driving simulator reveal that this combination causes less visual distraction than a touch screen with a graphical user interface, leading to better driving performance and a lower perceived workload when interacting with a menu while driving.
Adrian Benigno Latupeirissa and Roberto Bresin [5] introduce PepperOSC, an interface that equips Pepper and NAO robots with interactive sonification functionality. The interface streams kinematic data as Open Sound Control (OSC) messages, enabling movement sonification as a means of human-robot interaction. With different sound designs [6], PepperOSC has been used in a university course project and a museum installation.
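The OSC transport that carries such kinematic data is a simple binary datagram format. As a rough illustration, not the PepperOSC implementation itself, and with a hypothetical address namespace, a single joint-angle message can be encoded with nothing but the Python standard library:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC 1.0 message carrying one float32 argument."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    # address pattern, type tag string ",f", then the big-endian float
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# hypothetical address; the namespace actually used by PepperOSC may differ
msg = osc_message("/pepper/joint/HeadYaw", 0.25)
```

A datagram like this could be sent over UDP once per animation frame to a sound engine such as SuperCollider or Pure Data, which maps the joint angles onto synthesis parameters.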
Simon Linke, Rolf Bader and Robert Mores [7] introduce the Impulse Pattern Formulation (IPF) [8] as a Model-Based Sonification approach. Even though the IPF has only a few parameters, the recursive function can describe various interactive systems, such as physical, neural, and social systems, and it can produce multiple regimes. An interactive experiment shows that users intuitively hear a regime change, which can represent absolute values such as a coordinate origin or a threshold.
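In its simplest single-reflection form, the IPF is often written as the recursion g⁺ = g − ln(g/α), where α encodes the reflection strength; depending on α, the iteration settles into a stable fixed point or bifurcates into periodic and chaotic regimes. The following is a minimal sketch under that assumed one-parameter form (the parameterization in [7, 8] is richer and may differ in detail):

```python
import math

def ipf_step(g: float, alpha: float) -> float:
    # assumed simplest single-reflection IPF: g+ = g - ln(g / alpha)
    return g - math.log(g / alpha)

def ipf_series(alpha: float, g0: float = 1.0, n: int = 200) -> list:
    """Iterate the IPF; the series can drive a synthesis parameter so that
    a regime change becomes audible to the listener."""
    g, out = g0, []
    for _ in range(n):
        g = ipf_step(g, alpha)
        out.append(g)
    return out

# In this form the fixed point g = alpha is stable when |1 - 1/alpha| < 1,
# i.e. for alpha > 0.5, so the iteration converges to a steady state.
series = ipf_series(alpha=0.8)
```

Sonifying the series (e.g. mapping g to pitch or loudness) makes the transition from a steady regime to an oscillating one directly audible, which is what the interactive experiment in [7] exploits.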
Tim Ziemer [9] demonstrates that the psychoacoustic sonification design for three dimensions [10] enables naïve users to navigate through a three-dimensional space with high precision in a surgical use case. This example of applied psychoacoustics in the design of interactive sonification may pave the way to supplementing and, partly, even substituting visualizations in multiple applications.
Mariana Seiça, Licínio Roque, Pedro Martins and F. Amílcar Cardoso [11] provide an extensive treatise on aesthetics in sonification design and perception, expanding their previous work on sonification aesthetics presented at the International Conference on Auditory Display (ICAD 2021) [12]. They argue that the design of sonification becomes a design of experience, and they conclude with an open call for action to reframe the sonification field into novel design spaces.
Joe Fitzpatrick and Flaithri Neff [13] propose considering principles of auditory perception, in terms of psychoacoustics and auditory scene analysis, to produce Perceptually Congruent Sonification (PerCS). Even though PerCS did not significantly improve measured performance or the subjective difficulty of interpreting auditory line charts, their guidelines help sonification designers choose perceptually meaningful parameters and ranges that may improve interpretability in more complex sonification scenarios. This paper is a reworked and substantially extended version of their presentation at the International Conference on Auditory Display (ICAD 2023) [14].
We hope that the articles in this special issue offer novel, interesting perspectives on the field and inspire the readers’ research.
References
1. Rönnberg N, Lenzi S, Ziemer T, Hermann T, Bresin R (eds) (2022) Psychoacoustics in the Loop: Proceedings of the 7th Interactive Sonification Workshop (ISon 2022), Delmenhorst, Germany. https://doi.org/10.5281/zenodo.7552268
2. Weger M, Ziemer T, Rönnberg N (eds) (2023) Sonification for the Masses: Proceedings of the 28th Annual International Conference on Auditory Display (ICAD 2023), Norrköping, Sweden. https://doi.org/10.21785/icad2023.000
3. Sterkenburg J, Landry S, FakhrHosseini S, Jeon M (2023) In-vehicle air gesture design: impacts of display modality and control orientation. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-023-00415-8
4. Sterkenburg J, Landry S, Jeon M (2019) Design and evaluation of auditory-supported air gesture controls in vehicles. J Multimodal User Interfaces 13:55–70. https://doi.org/10.1007/s12193-019-00298-8
5. Latupeirissa AB, Bresin R (2023) PepperOSC: enabling interactive sonification of a robot’s expressive movement. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-023-00414-9
6. Latupeirissa AB, Panariello C, Bresin R (2023) Probing aesthetics strategies for robot sound: complexity and materiality in movement sonification. J Hum-Robot Interact. https://doi.org/10.1145/3585277
7. Linke S, Bader R, Mores R (2023) Model-based sonification based on the impulse pattern formulation. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-023-00423-8
8. Bader R (2013) Nonlinearities and Synchronization in Musical Acoustics and Music Psychology. Springer, Berlin. https://doi.org/10.1007/978-3-642-36098-5
9. Ziemer T (2023) Three-dimensional sonification as a surgical guidance tool. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-023-00422-9
10. Ziemer T, Schultheis H (2019) Three orthogonal dimensions for psychoacoustic sonification. arXiv preprint. https://doi.org/10.48550/arXiv.1912.00766
11. Seiça M, Roque L, Martins P, Cardoso FA (2023) An interdisciplinary journey towards an aesthetics of sonification experience. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-023-00416-7
12. Seiça M, Roque L, Martins P, Cardoso FA (2021) A systemic perspective for sonification aesthetics. In: Proceedings of the 26th International Conference on Auditory Display (ICAD 2021). https://doi.org/10.21785/icad2021.033
13. Fitzpatrick J, Neff F (2023) Perceptually congruent sonification of auditory line charts. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-023-00413-w
14. Fitzpatrick J, Neff F (2023) Identifying a perceptually congruent frequency range for auditory line charts. In: Proceedings of the 28th International Conference on Auditory Display (ICAD 2023), Norrköping, Sweden. https://doi.org/10.21785/icad2023.1904
Funding
Open Access funding enabled and organized by Projekt DEAL.
Cite this article
Ziemer, T., Lenzi, S., Rönnberg, N. et al. Introduction to the special issue on design and perception of interactive sonification. J Multimodal User Interfaces 17, 213–214 (2023). https://doi.org/10.1007/s12193-023-00425-6