1 Introduction

Design and perception of sonification can be considered from the viewpoint of psychoacoustics and auditory scene analysis, but also from the viewpoint of aesthetics and user experience. The theme of the Interactive Sonification Workshop (ISon 2022) was “Psychoacoustics in the Loop” [1], underlining the usefulness of sonification for psychoacoustic experiments and the potential of psychoacoustic methods in the sound design and evaluation process of sonification. The theme of the International Conference on Auditory Display (ICAD 2023) was “Sonification for the Masses” [2], pointing towards the necessity of both interpretable and appealing sound design in sonification. The themes of these two major events in the field of sonification and auditory display underline the relevance of the topic of this special issue: the design and perception of interactive sonification. Sonification concepts can start with the data to present, with a use case in mind, with ideas about the user experience to create, or with the definition of a system that offers a manageable number of acoustic and interaction parameters. This special issue covers the technical side and the human factors as well as the aesthetic and creative aspects of sonification design and perception.

2 In this issue

Six submissions were selected for publication in this special issue; together, they advance the field of interactive sonification in general, and the design and perception of interactive sonification in particular.

Jason Sterkenburg, Steven Landry, Shabnam FakhrHosseini and Myounghoon Jeon [3] evaluate their interface from [4], which combines air gestures tracked with a motion sensor as a Human Interface Device with an auditory menu as the User Interface. Interactive experiments in a driving simulator reveal that this combination is less prone to visual distraction than touch screens with a Graphical User Interface, leading to better driving performance and a lower perceived workload when interacting with a menu while driving.

Adrian Benigno Latupeirissa and Roberto Bresin [5] introduce PepperOSC, an interface that equips Pepper and NAO robots with interactive sonification functionality. The interface streams kinematic data as Open Sound Control (OSC) messages, enabling movement sonification as a means of human-robot interaction. With different sound designs [6], PepperOSC has been used in a university course project and a museum installation.
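To illustrate the general idea of streaming kinematic data as OSC messages, the following minimal Python sketch sends simulated joint angles to a sound engine over UDP. The OSC addresses, joint names, port, and update rate are hypothetical and do not reflect the actual PepperOSC message format; only the python-osc library is assumed to be available.

import math
import time

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

# Hypothetical sound engine address (e.g., SuperCollider or Pure Data listening on UDP).
client = SimpleUDPClient("127.0.0.1", 9000)

def stream_joint_angles(duration_s: float = 5.0, rate_hz: float = 30.0) -> None:
    """Send simulated joint angles as OSC messages at a fixed rate."""
    t0 = time.time()
    while time.time() - t0 < duration_s:
        t = time.time() - t0
        # Simulated kinematic data standing in for the robot's sensor readings.
        shoulder_pitch = 0.5 * math.sin(2 * math.pi * 0.25 * t)
        elbow_roll = 0.3 * math.cos(2 * math.pi * 0.25 * t)
        client.send_message("/robot/joint/shoulder_pitch", shoulder_pitch)  # hypothetical address
        client.send_message("/robot/joint/elbow_roll", elbow_roll)          # hypothetical address
        time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    stream_joint_angles()

On the receiving side, a synthesis environment maps these continuous parameters to sound, which is where the different sound designs mentioned above come into play.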

Simon Linke, Rolf Bader and Robert Mores [7] introduce the Impulse Pattern Formulation (IPF) [8] as a Model-Based Sonification approach. Even though the IPF has only a few parameters, the recursive function can describe a variety of interactive systems, such as physical, neural, and social systems. The IPF can produce multiple regimes, and an interactive experiment shows that users intuitively hear a regime change, which can therefore represent absolute values, such as a coordinate origin or thresholds.
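As a purely numerical illustration of how a single parameter can switch such a recursion between regimes, the following Python sketch iterates what we assume to be the simplest, reflection-free form of the IPF, g_{k+1} = g_k - ln(g_k / alpha); the reduced form and the parameter values are illustrative assumptions rather than the authors' sonification model from [7, 8].

import math

def iterate_ipf(alpha: float, g0: float = 1.0, steps: int = 40) -> list[float]:
    """Iterate the assumed reflection-free IPF recursion and return the trajectory of g."""
    g, trajectory = g0, [g0]
    for _ in range(steps):
        if g <= 0:  # outside the domain of the logarithm; stop the iteration
            break
        g = g - math.log(g / alpha)
        trajectory.append(g)
    return trajectory

if __name__ == "__main__":
    # alpha = 0.8: the trajectory settles on the fixed point g = alpha (steady regime).
    # alpha = 0.4: the trajectory keeps oscillating (an audibly different regime).
    for alpha in (0.8, 0.4):
        tail = iterate_ipf(alpha)[-5:]
        print(f"alpha={alpha}: last values {[round(v, 3) for v in tail]}")

Sonifying the trajectory directly makes the transition between a steady and an oscillating regime immediately audible, which is the effect the interactive experiment in [7] builds on.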

Tim Ziemer [9] demonstrates that the psychoacoustic sonification design for three dimensions [10] enables naïve users to navigate through three-dimensional space with high precision in a surgical use case. This example of applied psychoacoustics in the design of interactive sonification may pave the way to supplementing, and in part even replacing, visualizations in multiple applications.

Mariana Seiça, Licínio Roque, Pedro Martins and F. Amílcar Cardoso [11] provide an extensive treatise on aesthetics in sonification design and perception. It expands their previous work on sonification aesthetics presented at the International Conference on Auditory Display (ICAD 2021) [12]. They argue that the design of sonifications becomes a design of experience, and they conclude with an open call to action to reframe the sonification field into novel design spaces.

Joe Fitzpatrick and Flaithri Neff [13] propose to consider principles of auditory perception, in terms of psychoacoustics and auditory scene analysis, to produce Perceptually Congruent Sonification (PerCS). Even though PerCS did not significantly improve measured performance or reduce the subjective difficulty of interpreting auditory line charts, their guidelines certainly help sonification designers choose perceptually meaningful parameters and ranges that may improve interpretability in more complex sonification scenarios. This paper is a reworked and substantially extended version of their presentation at the International Conference on Auditory Display (ICAD 2023) [14].
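As a point of contrast to such perceptually informed design, the following Python sketch renders a deliberately naive auditory line chart: each data point becomes a short sine tone whose frequency scales linearly with the value within a bounded range. The mapping, frequency range, and timing are arbitrary choices for illustration and are not the PerCS guidelines from [13].

import math
import struct
import wave

def sonify_series(values, f_lo=220.0, f_hi=880.0, note_s=0.25, sr=44100,
                  path="line_chart.wav"):
    """Render a data series as a sequence of sine tones, one tone per data point."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    samples = []
    for v in values:
        # Linear value-to-frequency mapping within a bounded range; a perceptually
        # motivated design would, for instance, space pitches logarithmically instead.
        f = f_lo + (v - lo) / span * (f_hi - f_lo)
        for n in range(int(note_s * sr)):
            samples.append(0.5 * math.sin(2 * math.pi * f * n / sr))
    with wave.open(path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 16-bit PCM
        w.setframerate(sr)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))

if __name__ == "__main__":
    sonify_series([1, 3, 2, 5, 4, 6, 5, 7])  # a small example "line chart"

Listening to such a naive rendering next to a perceptually congruent one makes clear why the choice of parameters and ranges matters.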

We hope that the articles in this special issue offer novel and interesting perspectives on the field and inspire the readers’ own research.