Interactive sonification of expressive hand gestures on a handheld device

Original Paper · Journal on Multimodal User Interfaces

Abstract

We present MoodifierLive, a mobile phone application that uses expressive music performance for the sonification of expressive hand gestures, mapping the phone’s accelerometer data to three performance parameters: tempo, sound level, and articulation. The application, and in particular its sonification principle, is described in detail. An experiment evaluated the perceived match between a gesture and the music performance it produced, comparing two distinct gesture-to-performance mappings. The results show that the application produces consistent performances, and that a mapping based on data collected from real gestures works better than one defined a priori by the authors.
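
The kind of accelerometer-to-performance mapping the abstract describes can be pictured with a minimal sketch. The function name, normalization bound, and parameter ranges below are hypothetical illustrations, not MoodifierLive’s actual mapping:

    import math

    def map_gesture_to_performance(ax, ay, az):
        """Hypothetical mapping from one 3-axis accelerometer sample (m/s^2)
        to the three performance parameters named in the abstract: tempo,
        sound level, and articulation. Illustrative only; the ranges and
        scaling are assumptions, not the paper's mapping."""
        # Gesture energy: deviation of the acceleration magnitude from gravity.
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        energy = abs(magnitude - 9.81)

        # Normalize to [0, 1], assuming an upper bound of 20 m/s^2.
        e = min(energy / 20.0, 1.0)

        # Energetic gestures map to faster tempo (assumed 60-180 BPM range),
        # louder playback (assumed -20 to 0 dB), and shorter, more staccato
        # articulation (0.0 = legato, 1.0 = staccato).
        tempo_bpm = 60.0 + 120.0 * e
        sound_level_db = -20.0 + 20.0 * e
        articulation = e
        return tempo_bpm, sound_level_db, articulation

    # Example: one fairly vigorous shake sample.
    print(map_gesture_to_performance(3.0, 12.0, 9.0))

In practice such a mapping would be applied to a smoothed stream of samples rather than single readings; the sketch only shows the per-sample parameter computation.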

Author information

Correspondence to Marco Fabiani.

Cite this article

Fabiani, M., Bresin, R. & Dubus, G. Interactive sonification of expressive hand gestures on a handheld device. J Multimodal User Interfaces 6, 49–57 (2012). https://doi.org/10.1007/s12193-011-0076-2
