Research article · DOI: 10.1145/3478384.3478387

Sonification for Conveying Data and Emotion

Published: 15 October 2021

Abstract

In the present study, a sonification of running data was evaluated. The aim of the sonification was both to convey information about the data and to convey a specific emotion. The sonification was evaluated in three parts: first as an auditory graph, second together with additional text information, and third together with an animated visualization, with a total of 150 responses. The results suggest that the sonification could convey an emotion similar to the one intended, but at the cost of a less accurate representation of the data. The addition of visual information supported understanding of the sonification and of the auditory representation of the data. The results thus suggest that it is possible to design a sonification that is perceived as both interesting and fun and that conveys an emotional impression, but that there may be a trade-off between musical experience and clarity in sonification.



Published In

AM '21: Proceedings of the 16th International Audio Mostly Conference
September 2021
283 pages
ISBN:9781450385695
DOI:10.1145/3478384
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. model of affect
  2. sonification
  3. user evaluation
  4. visualization

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

AM '21
AM '21: Audio Mostly 2021
September 1 - 3, 2021
virtual/Trento, Italy

Acceptance Rates

Overall Acceptance Rate 177 of 275 submissions, 64%


Cited By

  • (2024) Open Your Ears and Take a Look: A State-of-the-Art Report on the Integration of Sonification and Visualization. Computer Graphics Forum 43, 3. DOI: 10.1111/cgf.15114. Online publication date: 10-Jun-2024.
  • (2024) RainMind: Investigating Dynamic Natural Soundscape of Physiological Data to Promote Self-Reflection for Stress Management. International Journal of Human–Computer Interaction, 1–18. DOI: 10.1080/10447318.2024.2364468. Online publication date: 27-Jun-2024.
  • (2024) Audio augmented reality using sonification to enhance visual art experiences. International Journal of Human-Computer Studies 191, C. DOI: 10.1016/j.ijhcs.2024.103329. Online publication date: 1-Nov-2024.
  • (2023) Characterizing the visualization design space of distant and close reading of poetic rhythm. Frontiers in Big Data 6. DOI: 10.3389/fdata.2023.1167708. Online publication date: 6-Jun-2023.
  • (2023) “Foggy sounds like nothing” — enriching the experience of voice assistants with sonic overlays. Personal and Ubiquitous Computing 27, 5, 1927–1947. DOI: 10.1007/s00779-023-01722-3. Online publication date: 6-Jun-2023.
