
New Interfaces for Musical Expression 2015

In early June of this year (apologies for a much-delayed blog post), Mindtribe helped send me to the 15th annual New Interfaces for Musical Expression (NIME) conference. NIME is an academic conference that sits at the intersection of technology, music, interaction design, and performance art. After kicking off with a variety of workshops on building and hacking your own NIME device, the conference presented over 100 papers, posters, and demos about novel interfaces, including one I worked on. The conference also set up sound art exhibitions at several local galleries and arranged two concerts each evening, all open to the general public.

To give you a better sense of the kind of creative inventions people showcase at NIME, here are a few of my favorites:

“Rawr! A Study in Sonic Skulls: Embodied Natural History”, by Courtney Brown of Arizona State University

This was the craziest idea by far, and one only today’s technology could make possible. The project involved interviewing paleontologists, acquiring a 3D-scanned CT model of a Corythosaurus skull, and experimenting with various 3D-printing materials and artificial larynxes to reproduce what an actual dinosaur call may have sounded like. Visitors could push air through the skull with their own lungs to make dinosaur noises; she showed videos of her museum exhibit, where people “played” the skull to their obvious delight. For all you 3D printer owners who have been looking for ideas to print… why not a dinosaur?

“Soft Revolvers”, by Myriam Bleau of Montreal

Myriam performed this piece in concert, creating a DJ effect with four hefty clear acrylic tops, each housing sensors and a string of LEDs. Each top’s rotational speed and direction were transmitted wirelessly to a central laptop, where they were mapped to control assorted audio samples and beats. An overhead camera projected the beauty of the spinning tops onto the screen behind her.

“Beyond Editing: Extended Interaction with Textual Code Fragments”, by Charlie Roberts of UC Santa Barbara

I don’t know when the “live coding” performance was first introduced to the world, but I look forward to a future where all coding includes the audiovisual features presented in this talk. Using his browser-based audiovisual coding framework, Gibber, Charlie typed, deleted, and modified audio synthesis code on screen, and it was immediately parsed into corresponding riffs, beats, and tones. But that wasn’t all: the code itself danced too, blinking and fading to provide visual feedback as it ran. Overall, the whole experience was both educational and entertaining.
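
To give a flavor of that immediacy, here is a minimal sketch of browser-based synthesis. Note that it uses the standard Web Audio API rather than Gibber’s own syntax (which I won’t try to reproduce from memory), and the frequencies and timing are arbitrary. Edit the array and rerun to “live code” a different riff:

```typescript
// Minimal browser synthesis sketch using the standard Web Audio API.
// This is NOT Gibber's API -- just an illustration of how a browser
// can turn a line of code into sound the moment it runs.
// (Most browsers require a user gesture, e.g. a click, before audio starts.)
const ctx = new AudioContext();

// Play a short tone at `freq` Hz, starting `when` seconds from now.
function blip(freq: number, when: number, duration = 0.2): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = freq;
  osc.connect(gain).connect(ctx.destination);
  // Quick fade-out so each note ends without a click.
  gain.gain.setValueAtTime(0.3, ctx.currentTime + when);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + when + duration);
  osc.start(ctx.currentTime + when);
  osc.stop(ctx.currentTime + when + duration);
}

// A simple ascending riff, one note every quarter second.
[261.6, 329.6, 392.0, 523.3].forEach((f, i) => blip(f, i * 0.25));
```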


These are but the tip of the iceberg. At NIME, pretty much any new technology you could imagine was repurposed for musical expression. Everything was fair game: Kinect and Leap Motion controllers, motion capture systems, electromyography armbands, Wacom tablets, even industrial robot arms. Various projects used capacitive touch, RFID, Zigbee radios, and haptic feedback. One concert piece had the entire audience log onto the same website with our smartphones, repurposing our phone speakers to create a 100+ channel soundscape. There were instruments that played themselves, instruments that learned to play with you, and instruments that taught you as you played them. Carefully designed visualizations accompanied some of the pieces, from computer-generated patterns and images on screen to physical media like resonating pools of water or lightly sifted piles of salt.
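
I can only guess at how that smartphone piece was implemented, but one plausible architecture is a server that assigns each connecting phone its own channel and pitch, with synthesis done locally on each phone. Here is a hypothetical client sketch; the server URL and message format are invented for illustration and are not from the actual piece:

```typescript
// Hypothetical client for a distributed smartphone soundscape.
// Assumes a server at SERVER_URL that sends each phone a JSON message
// like { channel: 42, freq: 587.3 } -- this protocol is my invention,
// not taken from the NIME piece.
const SERVER_URL = "wss://example.com/soundscape"; // placeholder URL

const ctx = new AudioContext();
const socket = new WebSocket(SERVER_URL);

socket.onmessage = (event: MessageEvent) => {
  const { channel, freq } = JSON.parse(event.data);
  console.log(`This phone is channel ${channel}, playing ${freq} Hz`);

  // Each phone contributes one quiet, sustained voice; with 100+ phones
  // in the room, the superposition becomes the soundscape.
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  gain.gain.value = 0.1; // keep each individual speaker quiet
  osc.frequency.value = freq;
  osc.connect(gain).connect(ctx.destination);
  osc.start();
};
```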

For those of you curious about the software, a few audio tools came up often: Pure Data, Max/MSP (and its video counterpart, Jitter), ChucK, Faust, and SuperCollider. There were MATLAB plots of signal conditioning and filtering, and a group at CMU was creating an open-source machine learning library for Pure Data and Max/MSP.

Overall, it was an amazing experience. In the past, some NIME devices have gone on to gain fame and fortune: Imogen Heap demoed her Mi.Mu musical gloves at NIME 2014; Smule’s first hit app, the iPhone Ocarina, was presented at NIME 2009; and the Reactable, later used by Björk at Coachella, was presented at NIME 2003. Perhaps some of NIME 2015’s crazy ideas will hit the mainstream in the near future.

As I left the conference, I felt I had been exposed to a whole world of potential new interactions that could be enhanced by technology (and music!). I had also become more aware of all the sounds that surround us every day and how they can affect us, from mechanical buzzing to the sound of water droplets.

So on that note, let’s get you, dear reader, into the frame of mind to create your own New Interface for Musical Expression. Your basic building blocks are:

  • any gesture or interaction of your choice
  • the output, usually a sound or sample, but could also include visual and haptic feedback
  • a mapping from the gesture to the output (sketched in code below)
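
To make those blocks concrete, here is a toy example of my own, again a sketch using the Web Audio API: the gesture is the mouse’s horizontal position, the output is a sine tone, and the mapping scales position linearly to pitch (the 200–800 Hz range is an arbitrary choice):

```typescript
// A toy NIME: gesture -> mapping -> output.
// Gesture: horizontal mouse position. Output: a continuous sine tone.
// Mapping: screen position scaled linearly to 200-800 Hz (arbitrary).
const ctx = new AudioContext();
const osc = ctx.createOscillator();
const gain = ctx.createGain();

gain.gain.value = 0.2;
osc.connect(gain).connect(ctx.destination);
osc.start();

// Browsers suspend audio until a user gesture; resume on first click.
document.addEventListener("click", () => ctx.resume(), { once: true });

window.addEventListener("mousemove", (e: MouseEvent) => {
  // Normalize the gesture to 0..1, then map it onto the pitch range.
  const x = e.clientX / window.innerWidth;
  osc.frequency.value = 200 + x * 600;
});
```

Swap the mouse for an accelerometer, a camera, or a muscle sensor, and the tone for samples or haptics, and you have the same three blocks the NIME community plays with.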

What NIME do you think would be awesome? If you have one you’d like to share, feel free to let me know!