Perception and replication of planar sonic gestures

Published: 22 October 2012

Abstract

As tables, boards, and walls become surfaces where interaction can be supported by auditory displays, it becomes important to know how accurately and effectively a spatial gesture can be rendered by means of an array of loudspeakers embedded in the surface. Two experiments were designed and performed to assess: (i) how sequences of sound pulses are perceived as gestures when the pulses are distributed in space and time along a line; (ii) how the timing of pulses affects the perceived and reproduced continuity of sequences; and (iii) how effectively a second parallel row of speakers can extend sonic gestures to a two-dimensional space. Results show that azimuthal trajectories can be effectively replicated and that the switch between discrete and continuous gestures occurs within an inter-pulse interval range of 75 to 300 ms. The vertical component of sonic gestures cannot be reliably replicated.
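
As an illustration of the kind of stimuli the abstract describes, the sketch below (Python, not the authors' code) schedules a sonic gesture as a sequence of short pulses distributed in space and time along a linear loudspeaker array: each pulse gets an onset time, spaced by a chosen inter-pulse interval, and a speaker position along the line. All names and parameters (N_SPEAKERS, SPEAKER_SPACING_M, linear_gesture) are hypothetical, introduced only for this example.

```python
# Illustrative sketch only (not the authors' implementation): schedule a
# planar "sonic gesture" as a sequence of pulses swept along a line of
# loudspeakers embedded in a surface. All constants are assumptions.
from dataclasses import dataclass

N_SPEAKERS = 8            # assumed number of speakers along the line
SPEAKER_SPACING_M = 0.2   # assumed spacing between adjacent speakers (meters)

@dataclass
class Pulse:
    onset_s: float     # onset time of the pulse, in seconds
    speaker: int       # index of the speaker that emits the pulse
    position_m: float  # position of that speaker along the line, in meters

def linear_gesture(n_pulses: int, inter_pulse_interval_s: float) -> list[Pulse]:
    """Distribute n_pulses evenly across the array, one pulse every
    inter_pulse_interval_s seconds, sweeping from one end to the other."""
    pulses = []
    for k in range(n_pulses):
        # Map pulse index k onto a speaker index along the line.
        speaker = round(k * (N_SPEAKERS - 1) / max(n_pulses - 1, 1))
        pulses.append(Pulse(onset_s=k * inter_pulse_interval_s,
                            speaker=speaker,
                            position_m=speaker * SPEAKER_SPACING_M))
    return pulses

if __name__ == "__main__":
    # With a 75 ms interval the sweep tends to be heard as a continuous
    # gesture; near 300 ms it tends to break into discrete events (compare
    # the abstract's reported 75-300 ms transition range).
    for p in linear_gesture(n_pulses=8, inter_pulse_interval_s=0.075):
        print(f"t = {p.onset_s:.3f} s -> speaker {p.speaker} at {p.position_m:.2f} m")
```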





Published In

ACM Transactions on Applied Perception, Volume 9, Issue 4
October 2012
109 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/2355598
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 22 October 2012
Accepted: 01 June 2012
Revised: 01 June 2012
Received: 01 March 2012
Published in TAP Volume 9, Issue 4


Author Tags

  1. Auditory localization
  2. Sonic gestures

Qualifiers

  • Research-article
  • Research
  • Refereed

Cited By

  • (2019) Accessing and selecting menu items by in-air touch. Proceedings of the 13th Biannual Conference of the Italian SIGCHI Chapter: Designing the next interaction, 1-9. DOI: 10.1145/3351995.3352053. Online publication date: 23-Sep-2019.
  • (2018) Ecological Invitation to Engage with Public Displays. Proceedings of the 7th ACM International Symposium on Pervasive Displays, 1-2. DOI: 10.1145/3205873.3210704. Online publication date: 6-Jun-2018.
  • (2018) Musical Vision: an interactive bio-inspired sonification tool to convert images into music. Journal on Multimodal User Interfaces. DOI: 10.1007/s12193-018-0280-4. Online publication date: 8-Nov-2018.
  • (2013) The Perception of Sound Movements as Expressive Gestures. Sound, Music, and Motion, 509-517. DOI: 10.1007/978-3-319-12976-1_30. Online publication date: 15-Oct-2013.
