17th NIME 2017: Copenhagen, Denmark
- Cumhur Erkut: 17th International Conference on New Interfaces for Musical Expression, NIME 2017, Aalborg University, Copenhagen, Denmark, May 15-18, 2017. nime.org 2017
- Robert Van Rooyen, Andrew Schloss, George Tzanetakis: Voice coil actuators for percussion robotics. 1-6
- Maurin Donneaud, Cédric Honnet, Paul Strohmeier: Designing a multi-touch eTextile for music performances. 7-12
- Peter Williams, Daniel Overholt: Beads extended actuated digital shaker. 13-18
- Romain Michon, Julius O. Smith III, Matthew Wright, Christopher Chafe, John Granzow, Ge Wang: Passively augmenting mobile devices towards hybrid musical instrument design. 19-24
- Alice Eldridge, Chris Kiefer: Self-resonating feedback cello: interfacing gestural and generative processes in improvised performance. 25-29
- Don Derek Haddad, Xiao Xiao, Tod Machover, Joe A. Paradiso: Fragile instruments: constructing destructible musical interfaces. 30-33
- Florian Heller, Irene Meying Cheung Ruiz, Jan O. Borchers: An augmented flute for beginners. 34-37
- Gabriella Isaac, Lauren Hayes, Todd Ingalls: Cross-modal terrains: navigating sonic space through haptic feedback. 38-41
- J. Cecilia Wu, Mark Rau, Yun Zhang, Yijun Zhou, Matthew James Wright: Towards robust tracking with an unreliable motion sensor using machine learning. 42-47
- Álvaro Barbosa, Thomas Tsang: Sounding architecture: inter-disciplinary studio at HKU. 48-51
- Martin Matus Lerner: Osiris: a liquid based digital musical instrument. 52-55
- Spyridon Stasis, Jason Hockman, Ryan Stables: Navigating descriptive sub-representations of musical timbre. 56-61
- Peter Williams, Dan Overholt: Pitch fork: a novel tactile digital musical instrument. 62-64
- Çagri Erdem, Anil Çamci, Angus G. Forbes: Biostomp: a biocontrol system for embodied performance using mechanomyography. 65-70
- Esben W. Knudsen, Malte L. Hølledig, Mads Juel Nielsen, Rikke K. Petersen, Sebastian Bach-Nielsen, Bogdan-Constantin Zanescu, Dan Overholt, Hendrik Purwins, Kim Helweg: Audio-visual feedback for self-monitoring posture in ballet training. 71-76
- Rikard Lindell, Tomas Kumlin: Augmented embodied performance - extended artistic room, enacted teacher, and humanisation of technology. 77-82
- Jens Vetter, Sarah Leimcke: Homo restis - constructive control through modular string topologies. 83-86
- Jerônimo Barbosa, Marcelo M. Wanderley, Stéphane Huot: Exploring playfulness in NIME design: the case of live looping tools. 87-92
- Daniel Manesh, Eran Egozy: Exquisite score: a system for collaborative musical composition. 93-98
- Stahl Stenslie, Kjell Tore Innervik, Ivar Frounberg, Thom Johansen: Somatic sound in performative contexts. 99-103
- Jeppe Veirum Larsen, Hendrik Knoche: States and sound: modelling interactions with musical user interfaces. 104-109
- Guangyu Xia, Roger B. Dannenberg: Improvised duet interaction: learning improvisation techniques for automatic accompaniment. 110-114
- Palle Dahlstedt: Physical interactions with digital strings - a hybrid approach to a digital keyboard instrument. 115-120
- Charles Roberts, Graham Wakefield: Gibberwocky: new live-coding instruments for musical performance. 121-126
- Sasha Leitman: Current iteration of a course on physical interaction design for music. 127-132
- Alex Hofmann, Bernt Isak Wærstad, Saranya Balasubramanian, Kristoffer E. Koch: From interface design to the software instrument - mapping as an approach to FX-instrument building. 133-138
- Marco Marchini, François Pachet, Benoît Carré: Rethinking reflexive looper for structured pop music. 139-144
- Victor Zappi, Andrew Allen, Sidney S. Fels: Shader-based physical modelling for the design of massive digital musical instruments. 145-150
- David Johnson, George Tzanetakis: VRMin: using mixed reality to augment the theremin for musical tutoring. 151-156
- Richard Graham, Brian Bridges, Christopher Manzione, William Brent: Exploring pitch and timbre through 3D spaces: embodied models in virtual reality as a basis for performance systems design. 157-162
- Michael Gurevich: Discovering instruments in scores: a repertoire-driven approach to designing new interfaces for musical expression. 163-168
- Joe Cantrell: Designing intent: defining critical meaning for NIME practitioners. 169-173
- Juan C. Vasquez, Koray Tahiroglu, Johan Kildal: Idiomatic composition practices for new musical instruments: context, background and current applications. 174-179
- Florent Berthaut, Cagan Arslan, Laurent Grisoni: Revgest: augmenting gestural musical instruments with revealed virtual objects. 180-185
- Akito van Troyer: MM-RT: a tabletop musical instrument for musical wonderers. 186-191
- Fabio Morreale, Andrew P. McPherson: Design for longevity: ongoing use of instruments from NIME 2010-14. 192-197
- Samuel Delalez, Christophe d'Alessandro: Vokinesis: syllabic control points for performative singing synthesis. 198-203
- Gareth W. Young, Dave Murphy, Jeffrey Weeter: A qualitative analysis of haptic feedback in music focused exercises. 204-209
- Jingyin He, Jim W. Murphy, Dale A. Carnegie, Ajay Kapur: Towards related-dedicated input devices for parametrically rich mechatronic musical instruments. 210-215
- Asha Blatherwick, Luke Woodbury, Tom Davis: Design considerations for instruments for users with complex needs in SEN settings. 216-221
- Abram Hindle, Daryl Posnett: Performance with an electronically excited didgeridoo. 222-226
- Michael Zbyszynski, Mick Grierson, Matthew Yee-King: Rapid prototyping of new instruments with CodeCircle. 227-230
- Federico Visi, Baptiste Caramiaux, Michael Mcloughlin, Eduardo Reck Miranda: A knowledge-based, data-driven method for action-sound mapping. 231-236
- Spencer Salazar, Mark Cerqueira: ChuckPad: social coding for computer music. 237-240
- Axel Berndt, Simon Waloschek, Aristotelis Hadjakos, Alexander Leemhuis: AmbiDice: an ambient music interface for tabletop role-playing games. 241-244
- Sam Ferguson, Anthony Rowe, Oliver Bown, Liam Birtles, Chris Bennewith: Sound design for a system of 1000 distributed independent audio-visual devices. 245-250
- Richard Vogl, Peter Knees: An intelligent drum machine for electronic dance music production and performance. 251-256
- Martin Snejbjerg Jensen, Ole Adrian Heggli, Patricia Alves Da Mota, Peter Vuust: A low-cost MRI compatible keyboard. 257-260
- Sang Won Lee, Jungho Bang, Georg Essl: Live coding YouTube: organizing streaming media for an audiovisual performance. 261-266
- Solen Kiratli, Akshay Cadambi, Yon Visell: Hive: an interactive sculpture for musical expression. 267-270
- Matthew Blessing, Edgar Berdahl: The Joystyx: a quartet of embedded acoustic instruments. 271-274
- Graham Wakefield, Charles Roberts: A virtual machine for live coding language design. 275-278
- Tom Davis: The feral cello: a philosophically informed approach to an actuated instrument. 279-282
- Francisco Bernardo, Nicholas Arner, Paul Batchelor: O soli mio: exploring millimeter wave radar for musical interaction. 283-286
- Constanza Levicán, Andrés Aparicio, Vernon Belaúnde, Rodrigo F. Cádiz: Insight2OSC: using the brain and the body as a musical instrument with the Emotiv Insight. 287-290
- Benjamin D. Smith, Neal Andersen: Arraynger: new interface for interactive 360° spatialization. 291-295
- Alexandra Murray-Leslie, Andrew Johnston: The liberation of the feet: demaking the high heeled shoe for theatrical audio-visual expression. 296-301
- Christiana Rose: Salto: a system for musical expression in the aerial arts. 302-306
- Marije A. J. Baalman: Wireless sensing for artistic applications, a reflection on Sense/Stage to motivate the design of the next stage. 307-312
- Ivica Bukvic, Spencer Lee: Glasstra: exploring the use of an inconspicuous head mounted display in a live technology-mediated music performance. 313-318
- Scott Barton, Ethan Prihar, Paulo Carvalho: Cyther: a human-playable, self-tuning robotic zither. 319-324
- Beici Liang, György Fazekas, Andrew P. McPherson, Mark B. Sandler: Piano pedaller: a measurement system for classification and visualisation of piano pedalling techniques. 325-329
- Jason Long, Jim W. Murphy, Dale A. Carnegie, Ajay Kapur: A closed-loop control system for robotic hi-hats. 330-335
- Stratos Kountouras, Ioannis Zannos: Gestus: teaching soundscape composition and performance with a tangible interface. 336-341
- Hazar Emre Tez, Nick Bryan-Kinns: Exploring the effect of interface constraints on live collaborative music improvisation. 342-347
- Irmandy Wicaksono, Joseph A. Paradiso: FabricKeyboard: multimodal textile sensate media as an expressive and deformable musical interface. 348-353
- Kristians Konovalovs, Jelizaveta Zovnercuka, Ali Adjorlu, Daniel Overholt: A wearable foot-mounted / instrument-mounted effect controller: design and evaluation. 354-357
- Herbert H. C. Chang, Lloyd May, Spencer Topel: Nonlinear acoustic synthesis in augmented musical instruments. 358-363
- Georg Hajdu, Benedict Carey, Goran Lazarevic, Eckhard Weymann: From atmosphere to intervention: the circular dynamic of installations in hospital waiting areas. 364-369
- Dom Brown, Chris Nash, Tom Mitchell: A user experience review of music interaction evaluations. 370-375
- Wayne Siegel: Conducting sound in space. 376-380
- Spencer Salazar, Sarah Reid, Daniel McNamara: The fragment string. 381-386
- Staas de Jong: Ghostfinger: a novel platform for fully computational fingertip controllers. 387-392
- Jack Armitage, Fabio Morreale, Andrew P. McPherson: The finer the musician, the smaller the details: NIMEcraft under the microscope. 393-398
- Sandor Mehes, Maarten van Walstijn, Paul Stapleton: Virtual-acoustic instrument design: exploring the parameter space of a string-plate model. 399-403
- Nicolas Bouillot, Zack Settel, Michal Seta: SATIE: a live and scalable 3D audio scene rendering environment for large multi-channel loudspeaker configurations. 404-409
- Hugo Scurto, Frédéric Bevilacqua, Jules Françoise: Shaping and exploring interactive motion-sound mappings using online clustering techniques. 410-415
- Kiran Bhumber, Bob Pritchard, Kitty Rodé: A responsive user body suit (RUBS). 416-419
- Marie Koldkjær Højlund, Morten S. Riis, Daniel Rothmann, Jonas R. Kirkegaard: Applying the EBU R128 loudness standard in live-streaming sound sculptures. 420-425
- Edgar Berdahl, Matthew Blessing, Matthew Williams, Pacco Tan, Brygg Ullmer, Jesse T. Allison: Spatial audio approaches for embedded sound art installations with loudspeaker line arrays. 426-430
- Fiona Keenan, Sandra Pauletto: Design and evaluation of a digital theatre wind machine. 431-435
- Ian Hattwick, Marcelo M. Wanderley: Design of hardware systems for professional artistic applications. 436-441
- Alexander Refsum Jensenius, Victor E. González Sánchez, Agata Zelechowska, Kari Anne Vadstensvik Bjerkestrand: Exploring the Myo controller for sonic microinteraction. 442-445
- Joseph Tilbian, Andrés Cabrera: Stride for interactive musical instrument design. 446-449
- José Miguel Fernandez, Thomas Köppel, Nina Verstraete, Grégoire Lorieux, Alexander Vert, Philippe Spiesser: GeKiPe, a gesture-based interface for audiovisual performance. 450-455
- Jeppe Veirum Larsen, Hendrik Knoche: Hear you later alligator: how delayed auditory feedback affects non-musically trained people's strumming. 456-459
- Michael Mulshine, Jeff Snyder: OOPS: an audio synthesis library in C for embedded (and other) applications. 460-463
- Maria Kallionpää, Chris Greenhalgh, Adrian Hazzard, David M. Weigl, Kevin R. Page, Steve Benford: Composing and realising a game-like performance for Disklavier and electronics. 464-469
- Thor Magnusson: Contextualizing musical organics: an ad-hoc organological classification approach. 470-475
- Stefano Fasciani: Physical audio digital filters. 476-480
- Benjamin Taylor: A history of the audience as a speaker array. 481-486
- Takumi Ogata, Gil Weinberg: Robotically augmented electric guitar for shared control. 487-488
- Ben Neill: The mutantrumpet. 489-490
- Scott Smallwood: Locus sono: a listening game for NIME. 491-492
- Richard Polfreman, Benjamin Oliver: Rubik's cube, music's cube. 493-494
- Charles P. Martin, Jim Tørresen: MicroJam: an app for sharing tiny touch-screen performances. 495-496
- Ryu Nakagawa, Shotaro Hirata: Aeve: an audiovisual experience using VRHMD and EEG. 497-498
- Rodrigo F. Cádiz, Alvaro Sylleros: Arcontinuo: the instrument of change. 499-500
- Rob Blazey: Kalimbo: an extended thumb piano and minimal control interface. 501-502
- Joseph Tilbian, Andrés Cabrera, Steffen Martin, Lukasz Olczyk: Stride on Saturn M7 for interactive musical instrument design. 503-504
- Tetsuro Kitahara, Sergio I. Giraldo, Rafael Ramírez: JamSketch: a drawing-based real-time evolutionary improvisation support system. 505-506
- Jacob Harrison, Andrew P. McPherson: An adapted bass guitar for one-handed playing. 507-508
- Krzysztof Cybulski: Feedboxes. 509-510
- Seth Glickman, Byunghwan Lee, Fu Yen Hsiao, Shantanu Das: Music everywhere - augmented reality piano improvisation learning system. 511-512
- Juan Bender, Gabriel Lecup, Sergio Fernandez: Song kernel - explorations in intuitive use of harmony. 513-514