1 Introduction
How relevant were your interactions to the sounds you heard today? If you cannot answer this in less than a second, it is because we process sound as a result of our interactions so ubiquitously and so effortlessly that we stop being aware of how versatile, informative and enriching these closed-loop auditory interactions are. Every footstep, every bodily movement, every interaction with physical objects rewards us with sounds whose features in turn guide our next action.
Today’s technology allows us to integrate such seamless sonic interactions in situations where data is involved, where sonification can leverage our understanding of the world around us and support data inspection and operation in the world, for instance for people with sensory deficits or with otherwise occupied visual attention.
One of the main characteristics of sound is its inherent time dependence and evolution, making it a first-class information channel for supporting and augmenting interaction between humans and machines, for example in displays involving selection, manipulation, or control of data. In recent years, it has become clear that there is an important need for research addressing the interaction with auditory displays more explicitly. Interactive Sonification is the specialized research topic concerned with the use of sound to portray data, but where a human being is at the heart of an interactive control loop. Specifically, it deals with the following areas (among others), in which we invited submissions of research papers:
- interfaces between humans and auditory displays
- mapping strategies and models for creating coherency between action and reaction (e.g. acoustic feedback, also in combination with haptic or visual feedback)
- perceptual aspects of the display (how to relate actions and sound, e.g. cross-modal effects, importance of synchronization)
- applications of Interactive Sonification
- evaluation of performance, usability and multimodal interactive systems including auditory feedback
For this special issue, we encouraged researchers to submit work focusing on Adaptivity and Scaffolding in Interactive Sonification, i.e. how auditory feedback and interactive sonification provide a scaffolding for becoming familiar with an interaction and learning to interact, and how users adapt their activity patterns according to the feedback and their level of experience. For example, in a sports movement sonification the user’s attention could initially be directed to basic patterns such as the movements of the active arm; once the user progresses, i.e. the feedback indicates that the user understands and utilizes this information, the display could add increasingly subtle cues (e.g. information on the knees) by making such auditory streams more salient.
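To make this scaffolding idea concrete, the following minimal sketch (illustrative only, not taken from any of the papers in this issue) shows how the salience of a secondary, more subtle auditory stream could be tied to an assumed performance score between 0 and 1; the function name, the score, and the threshold are hypothetical.

```python
import numpy as np

def mix_streams(primary: np.ndarray, secondary: np.ndarray,
                performance_score: float) -> np.ndarray:
    """Blend two sonification streams: the secondary (more subtle) stream
    is faded in as the user's performance on the primary task improves."""
    # Map performance in [0, 1] to a gain for the secondary stream:
    # silent below 0.5, fully audible at 1.0 (thresholds are arbitrary).
    gain = float(np.clip((performance_score - 0.5) / 0.5, 0.0, 1.0))
    return primary + gain * secondary
```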
This kind of adaptivity feeds into an important research question: how can we evaluate the complex and temporally evolving interrelationship between a human user and an interactive system that is coupled to the user by means of Interactive Sonification? To make a sustainable contribution, we strongly encouraged a reproducible research approach in Interactive Sonification to (a) allow for the formal evaluation and comparison of Interactive Sonification systems, and (b) establish standards in Interactive Sonification.
This Special Issue gathers contributions in the field of Interactive Sonification. Some of the publications are extended versions of papers presented at the 5th Interactive Sonification workshop (ISon 2016), held at Bielefeld University, Germany. The ISon workshop is a single-track workshop with a special focus on Interactive Sonification, and it has been organized every third year since 2004, when the first ISon took place in Bielefeld (please visit https://interactive-sonification.org for information about all previous editions of ISon).
2 In this issue
Overall, eight submissions were selected for publication in this special issue, which will hopefully help to advance the field of Interactive Sonification. They focus on three application areas: (1) real-time sonification of human body movement, (2) sonification-aided navigation, and (3) artistic sonification in musical expression and performance.
2.1 Movement sonification
Maes, Lorenzoni and Six, in their SoundBike project, looked into the effect of additional auditory cues conveying the cyclist’s motor rhythm on the synchronization between pedaling cadence and external rhythmic music [1]. The experimental results show that the designed sonification had an effect on tempo regulation, which can lead to a more consistent cadence and power output. The authors also discuss the system’s potential as a novel expressive musical interface.
Lorenzoni et al. proposed a music-augmentation sonification that modifies the quality of the music based on a runner’s gait information [2]. The degradation methods include adding noise, downsampling, and decreasing the volume. The study finds that this embodied sonification helped participants regulate their pace.
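As a rough illustration of such gait-dependent degradation (a sketch under our own assumptions, not the authors’ implementation), the three methods named above could be applied in proportion to an assumed deviation value between 0 and 1:

```python
import numpy as np

def degrade(music: np.ndarray, deviation: float) -> np.ndarray:
    """Degrade a mono music buffer proportionally to a gait deviation in [0, 1]."""
    d = float(np.clip(deviation, 0.0, 1.0))
    out = music.astype(np.float32)
    # 1) additive noise, scaled by the deviation
    out = out + d * 0.1 * np.random.randn(len(out)).astype(np.float32)
    # 2) crude downsampling: keep every k-th sample and hold it
    k = 1 + int(d * 7)
    out = np.repeat(out[::k], k)[:len(music)]
    # 3) volume decrease
    return out * (1.0 - 0.5 * d)
```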
Frid, Elblaus and Bresin, through a series of experiments, explored the auditory externalisation of fluid movements, ranging from parametric sonification to free vocal sketching, and conducted a workshop in which participants compared fluid and non-fluid sounds [3]. They concluded that fluid sounds tend to occupy the lower part of the spectrum and to be more continuous and slow (think of wind and water sounds), whereas non-fluid sounds tend to possess the opposite qualities. The authors provide design guidelines for sonically externalising movement fluidity.
Niewiadomski et al. studied the recognition of movement qualities by means of auditory display [4]. In their study, they sonified the movement qualities of fragility and lightness and compared an embodied and a non-embodied condition. The authors find that the two expressive movement qualities can be perceived through sonification alone, and that after embodied training, in which participants experienced the sonification of their own movement qualities, participants were better able to perceive the sonification of fragility.
2.2 Sonification-aided navigation
In Ziemer and Schultheis’ paper [5], an interactive sonification approach to navigation in two-dimensional space is presented. The main idea is to map orthogonal spatial dimensions to auditory properties that are also perceived as orthogonal, creating a psychoacoustic link with the visual domain. A mouse-movement experiment shows that participants were able to navigate to a given target location with the assistance of the auditory display after a relatively short training session. The paper points to many possible applications, particularly navigation in surgical operations.
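To give a flavour of mapping two spatial dimensions to two independently perceivable auditory parameters, here is a generic sketch; the specific mapping below (horizontal offset to pitch, vertical offset to amplitude-modulation rate) is our own assumption for illustration, not the psychoacoustic mapping designed by the authors.

```python
import numpy as np

def navigation_tone(dx: float, dy: float, dur: float = 0.5, rate: int = 44100) -> np.ndarray:
    """dx, dy: normalized offsets to the target, both in [-1, 1]."""
    t = np.linspace(0.0, dur, int(dur * rate), endpoint=False)
    freq = 440.0 * 2.0 ** dx           # horizontal offset -> pitch (one octave up or down)
    am_rate = 2.0 + 10.0 * abs(dy)     # vertical offset -> amplitude-modulation rate
    carrier = np.sin(2.0 * np.pi * freq * t)
    modulator = 0.5 * (1.0 + np.sin(2.0 * np.pi * am_rate * t))
    return (carrier * modulator).astype(np.float32)
```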
The paper by Skulimowski et al. also uses interactive sonification to aid navigation, but in a 3D scene, aiming to help visually impaired people [6]. The system uses a depth sensor to segment depth images and converts the segments into sound in real time. The paper presents a study in which visually impaired participants successfully navigated through an indoor space using the system prototype.
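As a hedged illustration of turning segmented depth regions into sound, one simple option is to let region distance drive pitch and region size drive loudness; this is a sketch of the general idea only, and the authors’ actual U-depth sonification scheme is described in their paper.

```python
import numpy as np

def region_tones(regions, dur: float = 0.3, rate: int = 44100) -> np.ndarray:
    """regions: list of (mean_depth_m, area_fraction) tuples, one per detected segment."""
    t = np.linspace(0.0, dur, int(dur * rate), endpoint=False)
    mix = np.zeros_like(t)
    for depth, area in regions:
        freq = 200.0 + 1800.0 / max(depth, 0.5)             # nearer obstacles sound higher
        mix = mix + area * np.sin(2.0 * np.pi * freq * t)   # larger obstacles sound louder
    peak = np.max(np.abs(mix)) if len(mix) else 0.0
    return (mix / peak if peak > 0 else mix).astype(np.float32)
```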
2.3 User-driven artistic sonification
Musical Vision is a tool developed by Polo and Sevillano [7] that generates harmonic polyphonic music from color images. The authors propose a bio-inspired approach to the sonification, based on the visual system’s field of view and on the photoreceptor cells (cones). The system allows the conversion of paintings into music of different lengths and can be used, for example, for augmenting vision.
Wolf and Fiebrink presented a platform that allows the audience of a musical performance to interact with the performance by sonifying the movements produced by musicians while playing an air drum [8]. The bidirectional and personalised approach promotes stronger enjoyment and a more engaging experience. The paper also explores the factors that motivate audience members’ engagement.
References
1. Maes PJ, Lorenzoni V, Six J (2018) The soundbike: musical sonification strategies to enhance cyclists’ spontaneous synchronization to external music. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-018-0279-x
2. Lorenzoni V, Van den Berghe P, Maes P-J, De Bie T, De Clercq D, Leman M (2018) Design and validation of an auditory biofeedback system for modification of running parameters. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-018-0283-1
3. Frid E, Elblaus L, Bresin R (2018) Interactive sonification of a fluid dance movement: an exploratory study. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-018-0278-y
4. Niewiadomski R, Mancini M, Cere A, Piana S, Canepa C, Camurri A (2018) Does embodied training improve the recognition of mid-level expressive movement qualities sonification? J Multimodal User Interfaces. https://doi.org/10.1007/s12193-018-0284-0
5. Ziemer T, Schultheis H (2018) Psychoacoustic auditory display for navigation: an auditory assistance system for spatial orientation tasks. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-018-0282-2
6. Skulimowski P, Owczarek M, Radecki A, Bujacz M, Rzeszotarski D, Strumillo P (2018) Interactive sonification of u-depth images in a navigation aid for the visually impaired. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-018-0281-3
7. Polo A, Sevillano X (2018) Musical vision: an interactive bio-inspired sonification tool to convert images into music. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-018-0280-4
8. Wolf KAE, Fiebrink R (2018) Personalised interactive sonification of musical performance data. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-019-00294-y