NIME 2008: Genova, Italy
- 8th International Conference on New Interfaces for Musical Expression, NIME 2008, Genova, Italy, June 5-7, 2008. nime.org 2008
- David Kim-Boyle: Network Musics - Play, Engagement and the Democratization of Performance. 3-8
- Álvaro Barbosa: Ten-Hand Piano: A Networked Music Installation. 9-12
- Michael Wozniewski, Nicolas Bouillot, Zack Settel, Jeremy R. Cooperstock: Large-Scale Mobile Audio Environments for Collaborative Musical Interaction. 13-18
- Angelo Fraietta: Open Sound Control: Constraints and Limitations. 19-23
- Matteo Bozzolan, Giovanni Cospito: SMuSIM: a Prototype of Multichannel Spatialization System with Multimodal Interaction Interface. 24-27
- Chris Nash, Alan F. Blackwell: Realtime Representation and Gestural Control of Musical Polytempi. 28-33
- Mikael Laurson, Mika Kuuskankare: Towards Idiomatic and Flexible Score-based Gestural Control with a Scripting Language. 34-37
- Alexandre Bouënard, Sylvie Gibet, Marcelo M. Wanderley: Enhancing the Visualization of Percussion Gestures by Virtual Character Animation. 38-43
- Diana Young: Classification of Common Violin Bowing Techniques Using Gesture Data from a Playable Measurement System. 44-48
- Jyri Pakarinen, Vesa Välimäki, Tapio Puputti: Slide Guitar Synthesizer with Gestural Control. 49-52
- Otso Lähdeoja: An Approach to Instrument Augmentation: the Electric Guitar. 53-56
- Juhani Räisänen: Sormina - a New Virtual and Tangible Instrument. 57-60
- Edgar Berdahl, Hans-Christoph Steiner, Collin Oldham: Practical Hardware and Algorithms for Creating Haptic Musical Instruments. 61-66
- Amit Zoran, Pattie Maes: Considering Virtual & Physical Aspects in Acoustic Guitar Design. 67-70
- Dylan Menzies: Virtual Intimacy: Phya as an Instrument. 71-76
- Jennifer Butler: Creating Pedagogical Etudes for Interactive Instruments. 77-80
- Dan Stowell, Mark D. Plumbley, Nick Bryan-Kinns: Discourse Analysis Evaluation Method for Expressive Musical Interfaces. 81-86
- Chris Kiefer, Nick Collins, Geraldine Fitzpatrick: HCI Methodology For Evaluating Musical Controllers: A Case Study. 87-90
- Olivier Bau, Atau Tanaka, Wendy E. Mackay: The A20: Musical Metaphors for Interface Design. 91-96
- Tobias Grosshauser: Low Force Pressure Measurement: Pressure Sensor Matrices for Gesture Analysis, Stiffness Recognition and Augmented Instruments. 97-102
- Giuseppe Torre, Javier Torres, Mikael Fernström: The Development of Motion Tracking Algorithms for Low Cost Inertial Measurement Units. 103-106
- Adrian Freed: Application of new Fiber and Malleable Materials for Agile Development of Augmented Instruments and Controllers. 107-112
- Alain Crevoisier, Greg Kellum: Transforming Ordinary Surfaces into Multi-touch Controllers. 113-116
- Nicholas Ward, Kedzie Penfield, Sile O'Modhrain, R. Benjamin Knapp: A Study of Two Thereminists: Towards Movement Informed Instrument Design. 117-121
- Vassilios-Fivos A. Maniatakos, Christian Jacquemin: Towards an Affective Gesture Interface for Expressive Music Performance. 122-127
- Anna Källblad, Anders Friberg, Karl Svensson, Elisabet S. Edelholm: Hoppsa Universum - An Interactive Dance Installation for Children. 128-133
- Antonio Camurri, Corrado Canepa, Paolo Coletta, Barbara Mazzarino, Gualtiero Volpe: Mappe per Affetti Erranti: a Multimodal System for Social Active Listening and Expressive Performance. 134-139
- Sergio Canazza, Antonina Dattolo: New Data Structure for Old Musical Open Works. 140-143
- Arne Eigenfeldt, Ajay Kapur: An Agent-based System for Robotic Musical Performance. 144-149
- Maurizio Goina, Pietro Polotti: Elementary Gestalts for Gesture Sonification. 150-153
- Stefano Delle Monache, Pietro Polotti, Stefano Papetti, Davide Rocchesso: Sonically Augmented Found Objects. 154-157
- Jean-Marc Pelletier: Sonified Motion Flow Fields as a Means of Musical Expression. 158-163
- Josh Dubrau, Mark Havryliv: P[a]ra[pra]xis: Poetry in Motion. 164-167
- Jan C. Schacher: Davos Soundscape, a Location Based Interactive Composition. 168-171
- Andrew Schmeder, Adrian Freed: uOSC: The Open Sound Control Reference Platform for Embedded Devices. 175-180
- Timothy A. Place, Trond Lossius, Alexander Refsum Jensenius, Nils Peters: Addressing Classes by Differentiating Values and Properties in OSC. 181-184
- Ananya Misra, Georg Essl, Michael Rohs: Microphone as Sensor in Mobile Phone Performance. 185-188
- Nicolas Bouillot, Michael Wozniewski, Zack Settel, Jeremy R. Cooperstock: A Mobile Wireless Augmented Guitar. 189-192
- Robert Jacobs, Mark Feldmeier, Joseph A. Paradiso: A Mobile Music Environment Using a PD Compiler and Wireless Sensors. 193-196
- Ross Bencina, Danielle Wilde, Somaya Langley: Gesture ≈ Sound Experiments: Process and Mappings. 197-202
- Miha Ciglar: "3rd. Pole" - A Composition Performed via Gestural Cues. 203-206
- Kjetil Falkenberg Hansen, Marcos Alonso: More DJ Techniques on the reactable. 207-210
- Smilen Dimitrov, Marcos Alonso, Stefania Serafin: Developing Block-Movement, Physical-Model Based Objects for the Reactable. 211-214
- Jean-Baptiste Thiebaut, Samer A. Abdallah, Andrew Robertson, Nick Bryan-Kinns, Mark D. Plumbley: Real Time Gesture Learning and Recognition: Towards Automatic Categorization. 215-218
- Mari Kimura: Making of VITESSIMO for Augmented Violin: Compositional Process and Performance. 219-220
- Jörn Loviscach: Programming a Music Synthesizer through Data Mining. 221-224
- Kia Ng, Paolo Nesi: i-Maestro: Technology-Enhanced Learning and Teaching for Music. 225-228
- Bart Kuyken, Wouter Verstichel, Frederick Bossuyt, Jan Vanfleteren, Michiel Demey, Marc Leman: The HOP Sensor: Wireless Motion Sensor. 229-232
- Niall Coghlan, R. Benjamin Knapp: Sensory Chairs: A System for Biosignal Research and Performance. 233-236
- Andrew B. Godbehere, Nathan J. Ward: Wearable Interfaces for Cyberphysical Musical Expression. 237-240
- Kouki Hayafuchi, Kenji Suzuki: MusicGlove: A Wearable Musical Controller for Massive Media Library. 241-244
- Michael Zbyszynski: An Elementary Method for Tablet. 245-248
- Gerard Roma, Anna Xambó: A Tabletop Waveform Editor for Live Performance. 249-252
- Andrea Valle: Integrated Algorithmic Composition: Fluid Systems for Including Notation in Music Composition Cycle. 253-256
- Andrea Valle: GeoGraphy: a Real-Time, Graph-Based Composition Environment. 257-260
- Iannis Zannos: Multi-Platform Development of Audiovisual and Kinetic Installations. 261-264
- Greg J. Corness: Performer Model: Towards a Framework for Interactive Performance Based on Perceived Intention. 265-268
- Paulo César S. Teles, Aidan Boyle: Developing an "Antigenous" Art Installation Based on a Touchless Endosystem Interface. 269-272
- Silvia Lanzalone: The 'Suspended Clarinet' with the 'Uncaused Sound': Description of a Renewed Musical Instrument. 273-276
- Mitsuyo Hashida, Yosuke Ito, Haruhiro Katayose: A Directable Performance Rendering System: Itopul. 277-280
- William R. Hazlewood, Ian Knopke: Designing Ambient Musical Information Systems. 281-284
- Aristotelis Hadjakos, Erwin Aitenbichler, Max Mühlhäuser: The Elbow Piano: Sonification of Piano Playing Movements. 285-288
- Yoshinari Takegawa, Masahiko Tsukamoto: UnitKeyboard: An Easily Configurable Compact Clavier. 289-292
- Cléo Palacio-Quintin: Eight Years of Practice on the Hyper-Flute: Technological and Musical Perspectives. 293-298
- Edgar Berdahl, Julius O. Smith III: A Tangible Virtual Vibrating String: A Physically Motivated Virtual Musical Instrument Interface. 299-302
- Christian Geiger, Holger Reckter, David Paschke, Florian Schulz, Cornelius Poepel: Towards Participatory Design and Evaluation of Theremin-based Musical Interfaces. 303-306
- Tomás Henriques: META-EVI: Innovative Performance Paths with a Wind Controller. 307-310
- Robin Price, Pedro Rebelo: Database and Mapping Design for Audiovisual Prepared Radio Set Installation. 311-314
- Kazuhiro Jo, Norihisa Nagano: Monalisa: "See the Sound, Hear the Image". 315-318
- Andrew Robertson, Mark D. Plumbley, Nick Bryan-Kinns: A Turing Test for B-Keeper: Evaluating an Interactive. 319-324
- Gabriel Gatzsche, Markus Mehnert, Christian Stöcklmeier: Interaction with Tonal Pitch Spaces. 325-330
- Parag Chordia, Alex Rae: Real-Time Raag Recognition for Interactive Music. 331-334
- Anders Vinjar: Bending Common Music with Physical Models. 335-338
- Margaret Schedel, Alison Rootberg, Elizabeth de Martelly: Scoring an Interactive, Multimedia Performance Work. 339-342
- Ayaka Endo, Yasuo Kuhara: Rhythmic Instruments Ensemble Simulator Generating Animation Movies Using Bluetooth Game Controller. 345-346
- Keith A. McMillen: Stage-Worthy Sensor Bows for Stringed Instruments. 347-348
- Lesley Flanigan, Andrew Doro: Plink Jet. 349-351
- Yusuke Kamiyama, Mai Tanaka, Hiroya Tanaka: Oto-Shigure: An Umbrella-Shaped Sound Generator for Musical Expression. 352-353
- Sean Follmer, Chris Warren, Adnan Marquez-Borbon: The Pond: Interactive Multimedia Installation. 354-355
- Ethan Hartman, Jeff Cooper, Kyle Spratt: Swing Set: Musical Controllers with Inherent Physical Dynamics. 356-357
- Paul Modler, Tony Myatt: Video Based Recognition of Hand Gestures by Neural Networks for the Control of Sound and Music. 358-359
- Kenji Suzuki, Miho Kyoya, Takahiro Kamatani, Toshiaki Uchiyama: beacon: Embodied Sound Media Environment for Socio-Musical Interaction. 360-361
- Eva Sjuve: Prototype GO: Wireless Controller for Pure Data. 362-363
- Robert Macrae, Simon Dixon: From Toy to Tutor: Note-Scroller is a Game to Teach Music. 364-365
- Stuart Favilla, Joanne Cannon, Tony Hicks, Dale Chant, Paris Favilla: Gluisax: Bent Leather Band's Augmented Saxophone Project. 366-369
- Staas de Jong: The Cyclotactor: Towards a Tactile Platform for Musical Interaction. 370-371
- Michiel Demey, Marc Leman, Frederick Bossuyt, Jan Vanfleteren: The Musical Synchrotron: Using Wireless Motion Sensors to Study How Social Interaction Affects Synchronization with Musical Tempo. 372-373