Abstract
We present MoodifierLive, a mobile phone application that uses expressive music performance for the sonification of expressive gestures, mapping the phone's accelerometer data to performance parameters (i.e., tempo, sound level, and articulation). The application, and in particular its sonification principle, is described in detail. An experiment was carried out to evaluate the perceived match between a gesture and the music performance it produced, using two distinct mappings between gestures and performance. The results show that the application produces consistent performances, and that the mapping based on data collected from real gestures works better than one defined a priori by the authors.
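To make the mapping concrete, the following is a minimal sketch in Python of how accelerometer data could drive the three performance parameters named in the abstract. All names, parameter ranges, and the energy measure here are illustrative assumptions, not the actual MoodifierLive mappings (including the one derived from real gesture data), which are described in the full article.

```python
import math

# Illustrative parameter ranges (assumed, not from the paper).
TEMPO_RANGE = (0.7, 1.4)         # relative tempo factor
LEVEL_RANGE = (-6.0, 6.0)        # sound level offset in dB
ARTICULATION_RANGE = (1.0, 0.5)  # legato (1.0) to staccato (0.5)

def gesture_energy(ax, ay, az, g=9.81):
    """Normalized gesture energy from one accelerometer sample (m/s^2):
    how far the acceleration magnitude deviates from gravity alone."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return min(abs(magnitude - g) / g, 1.0)

def map_to_performance(energy):
    """Linearly map gesture energy in [0, 1] to the three performance
    parameters named in the abstract: tempo, sound level, articulation."""
    def lerp(lo, hi, t):
        return lo + (hi - lo) * t
    return {
        "tempo": lerp(*TEMPO_RANGE, energy),
        "sound_level_db": lerp(*LEVEL_RANGE, energy),
        "articulation": lerp(*ARTICULATION_RANGE, energy),
    }

# A vigorous shake (high energy) yields a faster, louder, more staccato
# performance; a phone at rest (energy near 0) yields the opposite.
print(map_to_performance(gesture_energy(14.0, 3.0, 2.0)))
```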
Cite this article
Fabiani, M., Bresin, R. & Dubus, G. Interactive sonification of expressive hand gestures on a handheld device. J Multimodal User Interfaces 6, 49–57 (2012). https://doi.org/10.1007/s12193-011-0076-2