#nime2017
mrblazey · 7 years
NIME 2017 - Personal Highlights
I’ve just returned from my first NIME Conference (New Interfaces for Musical Expression) and wanted to make a record of some of my favourite parts, mainly for my own future reference, but also in the hope that it might be useful to other musicians/makers/researchers.
NIMECraft Workshop – Exploring the Subtleties of Digital Lutherie
Associated poster presentation - http://homes.create.aau.dk/dano/nime17/papers/0074/index.html
Getting thrown into group-based instrument making at 8.15 on the Monday was a great way to get started and meet a good chunk of people. The workshop started with a discussion on how the finer points of ‘digital lutherie’ (Jorda, 2005 - http://mtg.upf.edu/node/449 ) can be disseminated effectively – NIME papers tend to focus on technical details and developments at the expense of the craft decisions and artistic choices that go into the look and feel of a finished instrument. My group got a bit sidetracked talking about the differences between making instruments for yourself vs. for other people; for example, when making for yourself you are likely to be happier to put up with material or programming flaws because you know their causes and how to work around them. You are also likely to test and problem-solve individual aspects of the disassembled instrument, as well as incorporate programming and building developments into a tight feedback loop with your own playing/testing. This approach can afford to be quite haphazard, based on a junky/recycled aesthetic and materials, whereas making for other people involves more craft and more engineering – you need to be confident that the instrument can withstand heavy-handedness and that a stranger can operate it easily without your help/presence.
After the discussion we got on with modifying prototype DMIs (based on Bela boards) in groups, using a nice big mix of materials and adhesives. One great thing about this workshop was that we all started with a pre-programmed simple instrument. This meant that less tech-savvy folk like me could focus on fun ways to physically modify and actuate the instrument, but those who wanted to could get into the programming side as well – either way, everyone ended up with a working instrument and no one was left stuck at the programming stage. Another great thing that emerged was the way that the various groups chose to work together – my group split into two pairs and made one instrument per pair. One group divided the instrument into four sections and worked on one each. Another approach was to think in terms of the ‘whole animal,’ with one group requiring all members at once to play the instrument effectively.
This Digital Lutherie thread is still being followed by Bela and Queen Mary University of London, so keep an eye on their Twitter accounts if you would like to take part in something like this.
@qmul_mat
@belaplatform
Papers
Designing a Multi-Touch eTextile for Music Performances – Maurin Donneaud, Cedric Honnet, Paul Strohmeier.
http://homes.create.aau.dk/dano/nime17/papers/0002/index.html
One thing I really liked about these guys was the open-source mentality – there was no academic hoodwinking or holding cards close to their chest, but rather a big emphasis on this being something that you can do yourself, including links to all the resources and materials you would need to do so ( https://etextile.github.io/resistiveMatrix/ ). Not that I experienced any of this hoodwinking elsewhere at NIME – pretty much everyone was very open to chatting about their work and swapping info, influences and experiences.
Self-Resonating Feedback Cello: Interfacing gestural and generative processes in improvised performance – Alice Eldridge, Chris Kiefer.
http://homes.create.aau.dk/dano/nime17/papers/0005/index.html
A big intention for Kalimbo has been to streamline all the elements of my performance ecology (acoustic instrument, effects, synths, samples and controls) into one object that does not require the player to remove their hands to manipulate effects etc. The Feedback Cello is a good example of an instrument that does this very well. The inclusion of a speaker and transducers in/on the body adds another dimension of feedback, with the added possibility of manipulating the audio between pickups and speaker with analogue or digital effects, or even routing it through another musician’s setup, as they did during one of the concerts with Thor Magnusson. They have also used it as a kind of resonating effects unit for live coding sets, adding some rich physicality to a sound world that can risk being a bit too ‘in the box’.
Fragile Instruments: Constructing Destructable Musical Interfaces - Don Derek Hadad, Xiao Xiao, Tod Machover, Joseph Paradiso.
http://homes.create.aau.dk/dano/nime17/papers/0006/index.html
Some people say that laptop or electronic sets can be too clinical or stark and therefore err towards chin-stroking appreciation and away from more abandoned enjoyment. Xiao pointed out that this probably stems from the expense and associated preciousness of all the equipment involved. Guitars can be expensive but they are readily available, especially to the high profile musicians who have famously smashed them to bits. This paper demonstrates a great method for bringing some destruction and danger into electronic performances. Some people questioned the authenticity of this danger, as the bits being destroyed were basically a proxy for the actual expensive, precious bits of equipment, but I still found the sentiment inspiring. I’m hoping to start work on a performance approach wherein the building of a performance ecology is integral to the performance, in a way that provides an instant narrative as well as legibility of form for the audience. I had already envisioned the deconstruction/disassembly of the ecology as a good way to end the performance, but after this paper it seems so obvious that smashing it to pieces would be way more engaging and fun, not to mention cathartic!
Gibberwocky: New Live-Coding Instruments for Musical Performance - Charles Roberts, Graham Wakefield.
http://homes.create.aau.dk/dano/nime17/papers/0024/index.html
While I have enjoyed quite a few live coding performances by this point, I’ve always seen it as something other people do, rather than something that would benefit my own practice. One reason I have felt this way is that I know from years of experience how to get the kind of sounds I want from certain equipment and software. Some purists might not agree with the approach afforded by Gibberwocky, but during the talk I had a definite lightbulb moment of “I could use that!” Basically, it enables you to easily tie coding instructions into existing programs like Max or Ableton. For example, I might have a synth that I like to use, knowing that automating 2 or 3 parameters will have a pleasing effect. With Gibberwocky, this could be set up in a generative/algorithmic way. I’ve yet to try any live coding, so I don’t know how experienced coders would feel about this approach, but it certainly seemed like something I would like to try out.
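Gibberwocky’s own interface is JavaScript talking to Max or Ableton, so the snippet below is only a rough Python sketch of the underlying idea – computing automation values for a couple of synth parameters algorithmically instead of drawing them in by hand. The parameter names, ranges and the sine-plus-drift recipe are all invented placeholders for illustration, not anything from the paper:

```python
import math
import random

def automate(steps, seed=None):
    """Generate per-step values for two hypothetical synth parameters.

    'cutoff' follows a slow sine sweep with a little random drift;
    'resonance' cycles through a fixed pattern -- the sort of 2-3
    parameter automation described above, generated algorithmically.
    """
    rng = random.Random(seed)
    pattern = [0.2, 0.5, 0.8, 0.5]  # resonance steps, repeated
    frames = []
    for i in range(steps):
        cutoff = 0.5 + 0.4 * math.sin(2 * math.pi * i / 16)  # slow LFO
        cutoff += rng.uniform(-0.05, 0.05)                   # drift
        frames.append({
            "cutoff": max(0.0, min(1.0, cutoff)),  # clamp to 0..1
            "resonance": pattern[i % len(pattern)],
        })
    return frames

frames = automate(8, seed=1)
```

In a real setup each frame would be sent to the synth once per step (e.g. over MIDI or OSC); here the function just returns the values so you can inspect them.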
Current Iteration of a Course on Physical Interaction Design for Music - Sasha Leitman.
http://homes.create.aau.dk/dano/nime17/papers/0025/index.html
I think this paper is well worth reading for anyone who teaches in the realm of sound art, digital instrument design or any kind of digital creative practice. These areas tend to be approached by people from very different creative backgrounds, with different intentions and, most importantly, completely different base levels of technical knowledge. The main thing I took from this paper as useful to my own teaching was Sasha’s approach to dealing with this – students are first given a quiz on technical knowledge, complete with answers and directions to online resources that let them teach themselves how to arrive at those answers. This is followed up by another quiz without the answers and finally, depending on individual weak spots, further help and one-on-one tuition to level the playing field.
MM-RT: A Tabletop Musical Instrument for Musical Wonderers - Akito von Troyer.
http://homes.create.aau.dk/dano/nime17/papers/0035/index.html
Again, connections with my own past and future work made this instrument stand out for me. Part of Kalimbo’s appeal is the less-than-deliberate control system – you may not be able to create precise rhythms for drums or melody, but you can navigate soundscapes and ’find’ beats through gestural exploration. MM-RT also promotes an exploratory approach, but using sounds from physical materials. My upcoming work with the performative ecology building is going to involve individually controlled, non-quantised motorised percussion and un-synced tape loops. Akito’s demonstrations mainly involved short rhythmic loops, but he cited John Cage’s generative works as an influence, and when I talked to him at his demo he said these kinds of generative polyrhythms are possible with MM-RT. The legibility of form is also a big factor in how engaging this instrument is, as the audience sees you pick up various objects and materials and inevitably gets drawn into anticipating how each one will sound.
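As a toy illustration of how un-synced loops produce those shifting polyrhythms (my own sketch, nothing to do with MM-RT’s actual software): merge the onset times of a few loops whose periods share no common beat, and the combined pattern drifts in and out of phase, Cage-style:

```python
def phasing_events(loop_periods, horizon):
    """Merge onset times of several un-synced loops into one timeline.

    Each loop simply repeats at its own period; because the periods
    here are incommensurate, the combined onset pattern never quite
    lines up, drifting in and out of phase over time.
    """
    events = []
    for period in loop_periods:
        t = 0.0
        while t < horizon:
            events.append((round(t, 6), period))  # (onset time, which loop)
            t += period
    return sorted(events)

# Three loops with periods that share no common beat, over 4 seconds.
timeline = phasing_events([0.75, 1.0, 1.1], horizon=4.0)
```

Each motorised percussion element or tape loop would just follow its own row of onsets, with no shared clock – the polyrhythm emerges from the mismatch rather than from any sequencing.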
Design for Longevity: Ongoing Use of Instruments from NIME 2010-2014 - Fabio Morreale, Andrew McPherson.
http://homes.create.aau.dk/dano/nime17/papers/0036/index.html
I have been developing Kalimbo entirely as a tool for my own performance. However, along the way, a couple of people have asked if they could have one. This is an exciting prospect but got me thinking about what would actually be necessary to allow me to hand one off to someone else and expect it to work. This paper looked at just under 100 instruments presented at NIME before whittling them down to the tiny handful that went on to be sold commercially and used regularly in performance. This, along with the discussion from the digital lutherie workshop and lots of useful feedback from my demo session, gave me the inspiration to develop the instrument into something worth selling on, as well as a pretty good set of blueprints for making that a possibility. If you make DMIs and would like to sell them to the public, definitely read this and learn from what has worked or failed for others.
SALTO: A System for Musical Expression in the Aerial Arts - Christiana Rose.
http://homes.create.aau.dk/dano/nime17/papers/0058/index.html
Scoring choreography of any kind to music, or writing music for a piece of choreography, usually involves matching up sound and movement in precise timing (not always, but often enough…). The approach in this paper lets you use the performers’ movements, along with things like muscle strain and speed, to trigger and generate sounds directly. It’s almost as if the compositional structure is given to you for free, leaving you with just the sound design choices. Lots of scope!
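A minimal sketch of that movement-to-sound idea – not the SALTO system itself, and the speed threshold and scaling are invented for illustration – estimating speed from successive position samples, then mapping it to an amplitude and a trigger:

```python
def map_motion(positions, dt):
    """Map a stream of 1-D position samples to amplitude/trigger messages.

    Speed (finite difference of position over dt seconds) drives a
    0..1 amplitude; crossing a speed threshold fires a trigger. A toy
    version of mapping performer movement directly onto sound control.
    """
    msgs = []
    for prev, cur in zip(positions, positions[1:]):
        speed = abs(cur - prev) / dt
        msgs.append({
            "amp": min(1.0, speed / 10.0),  # scale speed into 0..1
            "trigger": speed > 5.0,         # fast moves fire a sound
        })
    return msgs

# A slow drift followed by one fast move: only the last sample triggers.
msgs = map_motion([0.0, 0.2, 0.3, 1.3], dt=0.1)
```

A real system would read accelerometer or EMG data and send these messages to a synth engine; the point is just that the movement supplies the timing structure for free.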
Cyther: A Human-Playable, Self-Tuning Robotic Zither - Scott Barton, Ethan Prihar, Paulo Carvalho.
http://homes.create.aau.dk/dano/nime17/papers/0061/index.html
Not too much to say about this one apart from that it sounds great and works great – the robotic capabilities are very versatile and dynamically expressive, and the design elegantly places all of the robotic workings beneath the strings, leaving the playing surface completely open to a human performer. As mentioned above, my own work so far is heavily tailored to me being the performer, whereas this instrument can be played by various people to get drastically different results, as demonstrated in the Expressive Machines Musical Instruments concert on Wednesday: Ben Taylor used live coding to generate patterns that would be impossible for a human to achieve, whereas Scott Barton incorporated a lot more human interaction and extended techniques in collaboration with the robotics.
Demos/Posters
Sounding Architecture: Inter-disciplinary Studio at HKU - Álvaro Barbosa, Thomas Tsang.
http://homes.create.aau.dk/dano/nime17/papers/0010/index.html
Beautiful, large-scale sculptural instruments based on architectural designs.
Live Coding YouTube: Organizing Streaming Media for an Audiovisual Performance - Sang Won Lee, Jungho Bang and George Essl.
http://homes.create.aau.dk/dano/nime17/papers/0049/index.html
Quite a practical, technical paper and not something I’m likely to use myself, but included in my highlights because Sang’s performance on Wednesday with multiple jabbering Donald Trumps was brilliantly terrifying.
Design Considerations for Instruments for Users with Complex Needs in SEN Settings – Asha Blatherwick, Luke Woodbury, Tom Davis.
http://homes.create.aau.dk/dano/nime17/papers/0040/index.html
Asha was in my workshop group when we discussed how instruments intended for people other than yourself need to be self-explanatory and hard-wearing, particularly in SEN settings where explaining how to use an instrument and how not to break it can be difficult.
Robotically Augmented Electric Guitar for Shared Control - Takumi Ogata, Gil Weinberg.
http://homes.create.aau.dk/dano/nime17/papers/0092/index.html
Really cool looking instrument, and again, judging from my observation of a few demo participants, really versatile from player to player. I also really liked the drum sequencer style of the control program, complete with randomiser for a nice variety of precision or chaos.
Performances
I think most of these were filmed, as was my own performance, but they aren’t yet online. I’ll add links when they are available.
Anthony T. Morasco – Listening – Composition based around a very cool homemade piano-roll/music box and a soprano singer.
Hans Peter Stubbe - Spatial Piano - Improvisation where the Disklavier acts like a second player, reacting to the player’s input.
Sabina Hyoju Ahn - Breath - Amazing performance where sound, light and visuals were generated and controlled with Sabina’s breath, using DIY circuits and a lighting rig made from e-waste that looked like a mini post-apocalyptic cityscape.
D. Andrew Stewart and Sang Won Lee - Disappearing: Live Writing – Stream of consciousness typing, generating sounds and seamlessly evolving into beautiful visuals and minimal beats.
Matthew Steinke – Robotic Musical Performance - All of the robotic performances were impressive but this one had the most charm in my opinion. Musically engaging, visually reminiscent of Victorian tinny automatons and with great collage-y visual and audio snippets.
(EDIT - excerpt here - https://youtu.be/f8KkhRJ2Ltc )
Sang Won Lee, Jungho Bang and George Essl – Live Coding YouTube - What’s more terrifying than Trump? Lots of Trumps. Turning them into elements of a piece of music helped take the edge off...
Jeff Snyder – Ghostline - Four performers have their movements tracked by webcam, interacting with an on-screen ‘ghostline’ to trigger sounds. Visualisations and sonifications shift over time in a composition where the sights and sounds are woven together perfectly.
Yemin Oh – Time Discontinuum - The sound and video of a piano performance is recorded over one minute. This material is then fragmented, repitched and replayed along with the live performer as his movements across the keyboard dictate how the original material gets regenerated.
Jonghyun Kim – Vehicle Music – Noisy, cheeky and fun. A radio-controlled car generates synth signals through its movements, parading around the performance space between occasionally smashing into cymbals, drinks and, most importantly, a loud, distorted guitar lying in the centre of the space. An older performance can be seen here - https://vimeo.com/36848457
There were a few timing clashes with my performance, sound check and demo, and I opted to prioritise late night concerts over early morning paper sessions so I did miss a handful of talks and performances. I’ve also focussed on things that related to my own work and/or inspired me the most so this list is by no means comprehensive but still, plenty of inspiration for me to be getting on with. Overall, NIME was an amazing event full of great people and I really hope I can be part of the conversation and return in the future.