IS71017B: Programming for Artists II (2017-18)
12 posts
is71017b · 7 years ago
Week 12
For the final step of the project I spent my remaining time debugging and organising my patches, then rehearsed the performance as many times as possible. On the day of the pop-up I was nervous, as I still felt underprepared, but everything went well regardless and the audience responded positively. Some aspects were less successful: I experimented with improvised granulator interludes, but these were not always in sync with the songs and were difficult to play with any virtuosity. Despite this, the mix was on point, and I also included flanger, reverb and stutter effects for extra audio-reactive elements. From here I want to evaluate the strengths and weaknesses of the project and discuss my ideas for future development. Reflecting on the steps outlined in my proposal, I feel I have mostly achieved my initial goals. However, the concepts of duality and minimalism could have been fleshed out more, which is a criticism I have of most of my projects. With more research this could have been better realised, and it is something I plan to improve upon in future. Looking back, I also wish I had spent more time in Max itself and taken full advantage of its generative possibilities as an audio-visual tool. I have become quite fond of the visual programming approach and plan to renew my Max subscription so I can use it in place of Ableton for my final project this year. The minimal visuals of the Duality project are a stepping stone towards an everyday practice with Jitter over the summer months. Overall, I believe I pushed myself with the construction of this AV synthesizer and have begun to conquer the huge learning curve that is Max/MSP. In summary, the technical aspects were successful, but in future I intend to cultivate an equal balance with research. Expect to see many more experiments with artistic programming in my career to come!
Attached is a link to the final performance.
Week 11
In this hectic penultimate week, my friend and I began studio sessions to mix and master the samples into workable songs. He uses Reason as his DAW of choice, so I sent him my sample pack beforehand to save time. We worked chronologically, starting with the track entitled ‘Creation’. For this intro song we kept the focus on a granular bass I had made, but experimented with the spatial dynamics of the sounds until we arrived at something very similar to Emptyset. For the next track we produced a techno beat centred on a flanging synth I had jammed with, chopped up into a main riff of sorts. In our opinion ‘Duplicate Animal’ is the weakest track: it lacks the punch of the others and became ‘overproduced’, as we spent far too long limiting and compressing all of the levels. To make up for this muddy second beat we returned to sonic experimentation at the end of the production sessions. We created complex drums consisting of polyrhythms and glitches, then built the final song around this anchoring pulse. For this one we hooked my Max synth up to record directly into his audio interface and used minimal FX processing. I think this benefited the overall sound, as ‘Pulsek232’ is definitely the cleanest and most professionally mixed track of the whole EP. There were many other layers and hidden elements to each song that I will not detail here for fear of surpassing my word count. However, I am pleased with how the final result turned out, especially considering we made it all in five days!
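As a side note on those polyrhythmic drums, the basic idea can be sketched in a few lines of Python (my own illustration, not the actual session files): lay two conflicting pulses onto a shared grid whose length is the least common multiple of the two cycles, so both repeat cleanly together.

```python
# Sketch: a 3-against-4 polyrhythm on a shared step grid.
# The grid length is the least common multiple of the two cycle
# lengths, so both patterns line up again at the end of the bar.
from math import lcm

def polyrhythm(a, b):
    steps = lcm(a, b)
    # Each step records whether pulse A and/or pulse B fires there.
    return [(s % a == 0, s % b == 0) for s in range(steps)]

grid = polyrhythm(3, 4)
```

Both pulses coincide only on step 0, which is what gives the pattern its rolling, off-kilter feel.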
SoundCloud. (2018). Arkos/Chris Speed Visuals - Duality. [online] Available at: https://soundcloud.com/artilekt/sets/arkoschris-speed-visuals-duality [Accessed 6 Nov. 2017]. 
Week 10
With time quickly running out, I thought it best to stick with what I already knew and base the performance mainly around Ableton/Max for Live. Though there will still be some of that Max/MSP ‘character’ to the sound, the overall musical arrangement will be sculpted in a more traditional DAW; I will, however, still incorporate Jitter as a video mixer. Rather than spending time working out how to build my own M4L devices, I purchased one that instantly emulates Ryoji Ikeda’s sounds and compositions. I also devoted some time to practising with the Granulator and another FX device called the Buffer Shuffler, which I will exploit for its insane stutter effects. After reaching a comfortable place with my audio setup, I began to consider how all of this would be reflected visually. Inspired by the dualistic yin-yang symbol, I wrote a simple openFrameworks sketch that depicts a flashing black and white circle. I then recorded my face shaking violently and brought the footage into VDMX to create an effect in the style of the collective Granular Synthesis. Finally, I amended my synth’s Jitter patch to include the new video files and mapped opacity sliders to my MIDI controller.
Strange Lines, Ryoji. [Accessed April 2018], https://strangelines.com/shop/max-for-live/ryoji/
RYOJI IKEDA, 2005. Dataplex [Online]. Raster-Noton. [viewed April 2018].  Available from: https://www.youtube.com/watch?v=F5hhFMSAuf4 
YouTube, 2013, [Accessed April 2018], https://www.youtube.com/watch?v=Spem4KiI8ek&t=5s
Week 9
This week I returned to practice-focused work as opposed to research. To begin this process, I read through some notes and looked at some Max patches a friend had sent me. He studies Sound Design at the University of Kent, so I found it informative to see how other institutions approach teaching visual programming. By modifying these patches to suit my own purposes, I began to create a sound-processing pipeline for the pulses, rhythms and one-shots that will musically drive my synth. For these audio samples I gathered a variety of WAV files from inside Ableton along with field recordings from my previous sound projects. These were then warped and mangled beyond recognition using flangers, granular synthesis and the Scan Music sonification patch by Andrew Benson. I also recorded many hours of practice with my synthesizer and cut all of this sonic exploration up into usable samples.
After completing this arduous process, I intended to complement my synth with a generative, state-based drum machine. I was heavily inspired by the stochastic rhythmic patterns of IDM (such as Autechre) and wanted to find a way of replicating this. Though there is an ‘Autechre’ Max patch leaked online, it is virtually unusable, as I soon discovered when attempting to tidy it up. My attention then drifted to another Max device, the Probability-Based Audio Sample Drum Machine by Yiannis Ioannides. Though this worked fine, I wanted to stick to building my own instruments, since I would be better at debugging them if something went wrong. For now I have given up on this concept, but will save it for another project. Instead I opted to build a generative panner that adds an element of pseudo-randomness to the synth’s stereo image.
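The generative panner idea translates readily outside Max. Here is a minimal Python sketch of the concept (my own stand-in, not the actual patch): a bounded random walk drives the pan position, and an equal-power law splits each mono sample into left/right gains so perceived loudness stays roughly constant as the sound drifts.

```python
# Sketch of a generative equal-power panner: a bounded random walk
# moves the pan position, and cos/sin gains keep total power constant.
import math
import random

def pan_gains(pos):
    """pos in [0, 1]: 0 = hard left, 1 = hard right (equal-power law)."""
    angle = pos * math.pi / 2
    return math.cos(angle), math.sin(angle)

def generative_pan(samples, step=0.05, seed=0):
    rng = random.Random(seed)
    pos, out = 0.5, []
    for s in samples:
        # Random walk, clamped to the stereo field.
        pos = min(1.0, max(0.0, pos + rng.uniform(-step, step)))
        left, right = pan_gains(pos)
        out.append((s * left, s * right))
    return out

stereo = generative_pan([1.0] * 100)
```

The equal-power (cos/sin) law is the standard choice here because left² + right² is always 1, avoiding the loudness dip at centre that a linear crossfade produces.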
YouTube, 2014, [Accessed April 2018], https://www.youtube.com/watch?v=sgbNJLafX_g
Jitter Recipes: Book 1, Recipes 0-12 2006, [Accessed March 2018], https://cycling74.com/tutorials/jitter-recipes-book-1
Probability-Based Audio Sample Drum Machine 2011, [Accessed March 2018], https://cycling74.com/projects/probability-based-audio-sample-drum-machine
Week 8
Following on from the Granular Synthesis chapter in Microsound, I came across a collective of the same name, recommended by one of my peers. In their most famous installation, MODELL 5, the duo used a similar approach to the sound-design technique but adapted it to work with video frames instead of grains. This creates a disturbing time-based effect on both sound and vision, blurring the lines between the real and the robotic. Another huge inspiration for my work with Max on this project is the audiovisual artist Julien Bayle. Though he has also created many performances with minimal visuals, my favourite work of his is a newer piece entitled FRGMENTS. In it he displays video portraits and then cuts them up using Jitter, in a technique inspired by the writer William S. Burroughs. Bayle also experiments with the spatialisation of the sound to create a sense of intense sensorial involvement for the viewer. Needless to say, I found this visually inspiring and plan to attempt similar approaches to add a human element to my performance. Along with Autechre, another integral sonic and aesthetic influence on my work is the duo Emptyset. For my intro track I plan to emulate their huge chains of sound processing and create a captivating opening to the set. Finally, to conclude my audio-visual research, I dug deep into the music of Ryoji Ikeda. I first became familiar with his work years ago when reading The Aesthetics of Failure by another electronic musician, Kim Cascone. In it, Cascone speaks of Ikeda’s tinnitus-inducing experiments with high-frequency sound, which instantly grabbed my interest. I have always loved his minimalist/maximalist approach, particularly when it comes to visuals, so I plan to build upon this framework for my composition.
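The MODELL 5 idea of granulating video rather than audio can be sketched abstractly (my own illustration, nothing to do with the installation's actual code): treat short contiguous runs of frames like audio grains, pick them at random from the source clip, and splice them into a new stuttering sequence.

```python
# Sketch: "granular synthesis" applied to video frames. Short random
# windows of frames are cut from a source clip and concatenated,
# producing the characteristic stuttering, time-smeared playback.
import random

def granulate_frames(frames, grain_len=4, n_grains=8, seed=1):
    rng = random.Random(seed)
    out = []
    for _ in range(n_grains):
        # Choose a random start so the whole grain fits in the clip.
        start = rng.randrange(0, len(frames) - grain_len + 1)
        out.extend(frames[start:start + grain_len])
    return out

# Frames stand in as integers here; in practice they would be images.
clip = granulate_frames(list(range(100)))
```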
YouTube, 2014, [Accessed April 2018], https://www.youtube.com/watch?v=micWnrTNNjo
Epidemic, Granular Synthesis, Modell 5. [Accessed April 2018], http://www.epidemic.net/en/art/granularsynthesis/proj/modell5.html 
Vimeo, 2017, [Accessed April 2018], https://vimeo.com/244002645
YouTube, 2013, [Accessed April 2018], https://www.youtube.com/watch?v=sFLVLQp5CmQ
AUTECHRE, 2016.  feed1.  In: elseq 1 (WARP 512.1) [Online]. Warp Records [viewed April 2018].  Available from: https://autechre.bleepstores.com/release/73330-autechre-elseq-15 
Cascone, K. (2000). The Aesthetics of Failure: “Post-Digital” Tendencies in Contemporary Computer Music, [online] Computer Music Journal Volume 24 Issue 4, p.12-18. Available at: https://www.mitpressjournals.org/doi/10.1162/014892600559489 [Accessed April 2018].
YouTube, 2015, [Accessed April 2018], https://www.youtube.com/watch?v=rYqcPo995fc
Week 7
Keeping with the theme of looking to the past to understand the present, and on the recommendation of Max V. Mathews’s paper, I began to read up on the wonderful research into computer music at Bell Labs in the 1970s. My personal favourite alumna from this time is the electronic musician Laurie Spiegel, well known for developing the algorithmic composition software Music Mouse. I am a big fan of her music in general, so inevitably I came across a YouTube video in which she describes her process as an artist and expresses her belief that modern electronic musicians should look inwards to their imagination for the synthesis of sounds. This inspired me to continue working on the Max synthesizer as the driving instrument behind my music, rather than focusing solely on sampling and remixing other people’s sonic explorations. Another strong female innovator within electronic music is the legendary Suzanne Ciani. When I was looking into the work of John Chowning earlier in the module, I stumbled across hers too and was equally if not more mesmerised. I found another online video in which she explains the core principles of analogue sound synthesis in a way anyone can understand. When she speaks of virtuosity with synthesizers, it encouraged me to practise and rehearse with my new DSP synth just as you would with a more traditional musical instrument.
Tero Parviainen, Music Mouse Emulator. [Accessed April 2018], http://www.teropa.info/musicmouse/
LAURIE SPIEGEL, 1980. The Expanding Universe [Online]. Unseen Worlds Records. [viewed April 2018].  Available from: https://unseenworlds.bandcamp.com/album/the-expanding-universe
YouTube, 2014, [Accessed April 2018], https://www.youtube.com/watch?v=q5Dxe0rVuhM
YouTube, 2016, [Accessed January 2018], https://www.youtube.com/watch?v=i1uzjFDQM3c&t=943s
YouTube, 2017, [Accessed April 2018], https://www.youtube.com/watch?v=36WMySitV74
YouTube, 2016, [Accessed April 2018], https://www.youtube.com/watch?v=CFD72PXOmxA
Week 6
With my Max/MSP/Jitter synthesizer virtually finalised, I moved back into some academic reading for inspiration and information. I looked to an early paper from the dawn of the 1960s entitled The Digital Computer as a Musical Instrument. Surprisingly, it details many of the theoretical foundations for computer music that we still draw on today, such as psychoacoustics, oscillators and signal processing. Based upon my reading of the “MSP Polyphony Tutorial 2: Granular Synthesis”, I decided to dig deeper and read one of the referenced texts to better understand how the technique works. This was the Granular Synthesis chapter of Curtis Roads’s book Microsound, in which he explores the theory, anatomy and physics of this almost wholly digital method of sound synthesis. I intend to follow this path of research over the next few weeks before returning to the practical construction of my audio-visual performance with a freshly informed perspective.
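To keep the reading concrete, here is a minimal plain-Python sketch of the anatomy Roads describes: short Hann-windowed grains are cut from a source signal and overlap-added at a fixed hop. It is only an illustration of the basic mechanism, not the polyphonic MSP implementation from the tutorial.

```python
# Minimal granular synthesis: windowed grains from a source signal,
# overlap-added into an output buffer at a fixed hop size.
import math

def hann(n):
    """Hann window: tapers each grain to zero at both ends."""
    return [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]

def granulate(source, grain_len=64, hop=32, n_grains=10):
    win = hann(grain_len)
    out = [0.0] * (hop * (n_grains - 1) + grain_len)
    for g in range(n_grains):
        # Read position in the source; wraps so grains always fit.
        start = (g * hop) % (len(source) - grain_len)
        for i in range(grain_len):
            out[g * hop + i] += source[start + i] * win[i]
    return out

signal = [math.sin(2 * math.pi * 5 * t / 1000) for t in range(1000)]
cloud = granulate(signal)
```

Varying the read position, grain length and hop independently of one another is what produces the time-stretching and cloud-like textures the chapter analyses.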
M. V. Mathews. The Digital Computer as a Musical Instrument. [PDF] Science, New Series, Vol. 142, No. 3592 (Nov. 1, 1963), pp. 553-557, American Association for the Advancement of Science. Available at http://www.jstor.org/stable/1712380?origin=JSTOR-pdf [Accessed April 2018].
Roads, C  2004, Microsound, MIT Press, Cambridge, Massachusetts.
Week 5
This week I focused more on the visual aspect of the synth and began to dive deep into Jitter. During Tuesday’s workshop with Matt I was able to fix a bug in which the audio would not initialise on start-up. I started by paying attention to the order of execution, using the debug probes to follow the signal, and discovered that audio would not begin without an initial ADSR envelope. To correct this, I saved an initial state and annotated the patch with more instructions.
I then began making visuals by playing around with the jit.catch~ help patch until I made an alteration I was aesthetically pleased with. I realised that if you send a jit.catch~ signal into a jit.graph object you get an instant Ryoji Ikeda-type visualisation! From here, I opened up the Datamatrix patch by Francesco Grani and imitated some of his methods. He starts with a filtergraph that filters the audio input and sends it to two jit.catch~ objects, then jit.matrix subpatches that store the left/right channels in two separate planes, followed by a jit.concat to reconnect the two matrices and a jit.transpose to flip them. I combined this with my jit.graph patch and, for now, built a simple video mixer using jit.xfade. Finally, I encapsulated all of this into a subpatch and included it in my synth, using the subtractive-synthesis filtergraph as a way of alternating the visual output.
Somehow, I also found time to connect my MIDI controller to Max with some assistance from Frieda’s ‘External Control’ patch. Controlling parameters as well as triggering notes this way feels much more intuitive and musical; I now can’t wait to make some beats with the synth! For the last few weeks I plan to make some more improvements: adding a tremolo to the stereo image for some generative panning, mapping visual FX to audio FX, and tidying everything up ready to be made into a Max for Live device to record with.
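For readers unfamiliar with Jitter, the two matrix operations at the heart of that mixer chain are simple enough to sketch in plain Python. These functions are assumed, simplified stand-ins for what jit.xfade and jit.transpose do to a frame, not the objects themselves.

```python
# Plain-Python stand-ins for two Jitter matrix operations:
# a linear crossfade between two frames, and a matrix transpose.

def xfade(a, b, x):
    """Blend two frames (lists of rows); x=0 gives a, x=1 gives b."""
    return [[(1 - x) * pa + x * pb for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def transpose(m):
    """Swap rows and columns, like jit.transpose on a 2-D matrix."""
    return [list(col) for col in zip(*m)]

black = [[0.0] * 4 for _ in range(3)]   # 3x4 frame of zeros
white = [[1.0] * 4 for _ in range(3)]   # 3x4 frame of ones
mixed = xfade(black, white, 0.25)       # 25% of the way to white
flipped = transpose(mixed)              # now 4x3
```

Mapping the crossfade parameter x to a MIDI slider is exactly the video-mixer behaviour described above.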
YouTube, 2017, [Accessed March 2018], https://www.youtube.com/watch?v=Y_fUh90H2QI
YouTube, 2013, [Accessed March 2018], https://www.youtube.com/watch?v=YIbbgdyLxIA
Week 4
Based upon Frieda’s advice, I focused more on the practical side of the assignment this week. At the end of last week, I had built a simple wavetable synth inspired by the “Waveshaping” tutorial in the Max documentation. I then made a duplicate of the cycle~ object, scaled its frequency, and summed the two wavetables for a simple form of additive synthesis. For control, I mapped some buttons that draw Chebyshev polynomials into the wavetable to introduce higher harmonics. To conclude, I added a simple filter system (from the help patch) for subtractive synthesis.
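The reason Chebyshev polynomials are useful for waveshaping can be shown in a few lines. This is a hedged sketch of the underlying identity, not the Max patch: since T_n(cos θ) = cos(nθ), driving the n-th polynomial with a full-amplitude cosine yields exactly the n-th harmonic.

```python
# Chebyshev waveshaping: T_n evaluated via the standard recurrence
# T_{n+1}(x) = 2x*T_n(x) - T_{n-1}(x). Shaping a cosine through T_n
# produces a pure n-th harmonic, because T_n(cos t) = cos(n*t).
import math

def chebyshev(n, x):
    t0, t1 = 1.0, x
    if n == 0:
        return t0
    for _ in range(n - 1):
        t0, t1 = t1, 2 * x * t1 - t0
    return t1

theta = [2 * math.pi * i / 64 for i in range(64)]
shaped = [chebyshev(3, math.cos(t)) for t in theta]  # cosine through T_3
third = [math.cos(3 * t) for t in theta]             # target: 3rd harmonic
```

At amplitudes below 1 the shaper produces a mix of harmonics instead, which is why input level doubles as a timbre control in waveshaping synths.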
It was not sounding like a professional synthesizer at this stage, so I knew I needed to add some more features. One of my favourite VSTs is the synth Serum, so I based my layout upon its GUI, separating it into six sections: OSCILLATOR A, OSCILLATOR B, FILTER, ENVELOPE, LFO and STEP SEQUENCER. I worked on each part one by one, starting with an envelope function to control the ADSR. I then used some Vizzie low-frequency oscillators (fed back into each other) as signals to further modulate the sine waves and generate feedback. To create more complex forms of synthesis I referred back to Frieda’s patches and found a way of including both frequency and ring modulation. Finally, I used the Max tutorial “Audio-Rate Sequencing” to create a step sequencer for the synth, alternating pitch and amplitude to make something somewhat more musical than before. My dreams of a granular-synthesis arpeggiator were shattered when I read through the Max tutorial and found that this was not feasible. Despite this, I still want to include the technique in the live performance, instead using Robert Henke’s ‘Granulator’ Max for Live device to process samples in real time.
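As a reference point for the envelope section, here is a minimal linear ADSR sketch in samples (my own illustration, not the Max function object): attack ramps 0 to 1, decay ramps down to the sustain level, which holds until a release ramp back to silence.

```python
# Minimal linear ADSR envelope, expressed as a list of per-sample
# gain values. attack/decay/hold/release are lengths in samples;
# sustain is a level in [0, 1].

def adsr(attack, decay, sustain, hold, release):
    env = []
    env += [i / attack for i in range(attack)]                    # 0 -> 1
    env += [1 - (1 - sustain) * i / decay for i in range(decay)]  # 1 -> sustain
    env += [sustain] * hold                                       # plateau
    env += [sustain * (1 - i / release) for i in range(release)]  # -> 0
    return env

env = adsr(attack=10, decay=10, sustain=0.5, hold=20, release=10)
```

Multiplying an oscillator's output by this gain curve, sample by sample, is all an envelope stage does; exponential segments just replace the linear ramps.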
This week I also read up on the Fast Fourier Transform, which was interesting but quite difficult to get my head around, let alone include in the assignment. So far I am quite pleased with how the audio side of my AV synth has turned out; however, it is still quite buggy at this stage and needs some improvements. I am starting to reconsider performing live with it, due to its unpredictability and the difficulty of becoming virtuosic with it. Instead I want to record several fully fledged tracks with it, break them down into samples, and mix/granulate them within Ableton Live. This way I have more flexibility in mastering them and making sure they sound good in the stereo image of a live PA system. After some cleaning up of the audio synth I plan to work more on the visual aspect next week. I have already begun this process by emailing Francesco Grani to ask for permission to edit his Datamatrix M4L patch.
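To demystify the FFT reading a little: the FFT is only a fast algorithm for the discrete Fourier transform, and the DFT itself fits in a few lines. The sketch below is the slow, direct definition, useful for understanding what the fast version computes.

```python
# The plain (slow, O(n^2)) discrete Fourier transform that the FFT
# accelerates: each output bin k is the signal correlated with a
# complex exponential at frequency k cycles per buffer.
import cmath
import math

def dft(signal):
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A pure sine at 4 cycles per buffer concentrates all its energy
# in bins 4 and n-4 of the magnitude spectrum.
n = 32
tone = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
spectrum = [abs(x) for x in dft(tone)]
```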
Wikipedia 2018, [Accessed March 2018], https://en.wikipedia.org/wiki/Chebyshev_polynomials
Serum, Advanced Wavetable Synthesizer, [Accessed March 2018], https://www.xferrecords.com/products/serum
YouTube, 2011, [Accessed March 2018], https://www.youtube.com/watch?v=9pn_b7OUO6I
Granulator II, 2016, [Accessed February 2018], http://roberthenke.com/technology/granulator.html
Wikipedia 2018, [Accessed March 2018], https://en.wikipedia.org/wiki/Fast_Fourier_transform
datamatrix (M4L jitter) 2014, [Accessed March 2018], https://cycling74.com/tools/datamatrix-m4l-jitter
Week 3
In the third week of my project I finally managed to complete the Kadenze course. The final two sessions were on Max programming paradigms and on integrating Max with other software and languages. The latter I found more useful, as it showed me how to script in Max using C/OpenGL and how to make Max for Live devices in Ableton, which will become useful knowledge later in the process. This week I further developed my understanding of Jitter by digging into Andrew Benson’s tutorials. I started with his Jitter cookbook, then read a bit about shaders, though my discoveries have led me to believe I can make all the visuals I need using predominantly jit.poke~, jit.catch~ and jit.matrix, with additional transformations from jit.mxform2d, jit.rota and jit.submatrix. In our final lesson, on state-based response processes, we learnt more about how to manipulate data and create emergent properties in Max. This gave me the idea of adding an arpeggiator to my instrument that uses granular synthesis to further warp and process sounds in an unpredictable way. Finally, I had a one-to-one tutorial with Frieda to discuss my project; she seemed to like my idea but said I needed to focus on practice rather than research at this stage. I then showed her the rastogram patch but could not fully explain how it worked, which made me re-evaluate this method of synthesis. I have now decided to start again and build a much simpler wavetable synth that I actually understand how to use; see the attached video.
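The "state-based response" idea from that final lesson can be illustrated with a tiny Markov chain, where the current note (the state) weights which note comes next. The transition table below is invented purely for illustration; it is the shape of the mechanism, not anything from the course materials.

```python
# Sketch of a state-based note generator: a first-order Markov chain.
# The current note determines the candidate next notes; duplicates in
# a list act as crude probability weights.
import random

TRANSITIONS = {
    "C": ["E", "G", "C"],
    "E": ["G", "C"],
    "G": ["C", "C", "E"],  # "C" listed twice: twice as likely
}

def markov_phrase(start, length, seed=7):
    rng = random.Random(seed)
    note, phrase = start, [start]
    for _ in range(length - 1):
        note = rng.choice(TRANSITIONS[note])
        phrase.append(note)
    return phrase

phrase = markov_phrase("C", 16)
```

Even with three states the output already feels "composed" rather than uniformly random, which is the emergent-property point the lesson was making.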
Kadenze Programming Max: Structuring Interactive Software for Digital Arts. 2016, [Accessed February 2018], https://www.kadenze.com/courses/programming-max-structuring-interactive-software-for-digital-arts-i/ 
Jitter Recipes: Book 1, Recipes 0-12 2006, [Accessed March 2018], https://cycling74.com/tutorials/jitter-recipes-book-1
Your First Shader 2007, [Accessed March 2018], https://cycling74.com/tutorials/your-first-shader
Week 2
This week I continued the research stage of the assignment and began to teach myself more about the Jitter matrix. I started by reading the first 17 tutorials of the Jitter documentation and completed the introduction to Vizzie. In this week’s Kadenze session I learnt how to utilise interactivity in Max using Arduino, OSC and the MIDI protocol. This was useful, since I was able to create a custom MIDI map for my Akai LPD8 controller, which I will later use to control the finished synth. In terms of contextual research, I watched a documentary about the classical avant-garde and found some composers’ approaches to minimalist music inspiring. Finally, I found a post by Cycling ’74 to be an interesting reflection on the present and future of generative music.
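At its core, a custom MIDI map is just a table from controller numbers to parameter ranges. The sketch below is a hypothetical illustration; the CC numbers and parameter names are invented, not the LPD8's actual defaults or my final map.

```python
# Hypothetical MIDI CC map: incoming 0-127 values are scaled linearly
# into each parameter's own range. CC numbers and names are invented.
CC_MAP = {
    1: ("filter_cutoff", 20.0, 20000.0),
    2: ("resonance", 0.0, 1.0),
    3: ("grain_size", 1.0, 500.0),
}

def handle_cc(cc, value, state):
    """Apply one incoming (cc, value) message to the synth state dict."""
    if cc in CC_MAP:
        name, lo, hi = CC_MAP[cc]
        state[name] = lo + (hi - lo) * value / 127
    return state

synth = handle_cc(2, 127, {})   # knob fully clockwise
synth = handle_cc(1, 0, synth)  # knob fully counter-clockwise
```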
A Meditation on Generative Musics (And Their Forms) 2017, [Accessed February 2018], https://cycling74.com/articles/a-meditation-on-generative-musics-%C2%ABand-their-forms%C2%BB
YouTube, 2013, [Accessed February 2018], https://www.youtube.com/watch?v=h0NwiTHIhGM
Kadenze Programming Max: Structuring Interactive Software for Digital Arts. 2016, [Accessed February 2018], https://www.kadenze.com/courses/programming-max-structuring-interactive-software-for-digital-arts-i/
Week 1
For the first week of the assignment I continued with the Kadenze course “Programming Max: Structuring Interactive Software for Digital Arts.” This online course has helped me gain a general understanding of Max/MSP/Jitter, in combination with the techniques we have learnt in class. In session 7 I learnt about the various data types of Max in depth and how to convert between them. In one tutorial I learnt about rastograms, a technique that yields exactly the results I was searching for. The paper “Raster Scanning: A New Approach to Image Sonification, Sound Visualization, Sound Analysis and Synthesis” by Woon Seung Yeo and Jonathan Berger taught me a great deal about the process: the visual output is completely dependent on the audio synthesis, a complete synergy of sonification, and perfect for what I wish to do. This discovery of raster scanning is an invaluable technique that I will utilise heavily in this project. I was also inspired by a Sound On Sound article about Autechre and their heavy use of Max/MSP in their workflow. To conclude, I briefly researched minimalist music but have not yet found anything particularly inspiring.
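The core of raster scanning is simple enough to sketch: write the audio samples left-to-right, top-to-bottom into an image, so the picture is a direct function of the waveform. This is a minimal illustration of the idea from Yeo and Berger's paper, not their implementation.

```python
# Minimal raster scan: fold a 1-D audio signal (values in -1..1) into
# rows of pixel brightness (0..255), left-to-right, top-to-bottom.
import math

def raster_scan(samples, width):
    pixels = [int((s + 1) / 2 * 255) for s in samples]  # map -1..1 -> 0..255
    return [pixels[r:r + width]
            for r in range(0, len(pixels) - width + 1, width)]

audio = [math.sin(2 * math.pi * 3 * t / 256) for t in range(256)]
image = raster_scan(audio, width=16)
```

Because the image is computed directly from the samples, any change to the synthesis is immediately visible, which is exactly the audio-visual coupling described above.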
Yeo, W. Berger, J. Raster Scanning: A New Approach to Image Sonification, Sound Visualization, Sound Analysis And Synthesis. [PDF] CCRMA, Department of Music, Stanford University. Available at https://ccrma.stanford.edu/~woony/publications/Yeo_Berger-ICMC06.pdf [Accessed February 2018].
Sound On Sound Autechre (2004), [Accessed February 2018], https://www.soundonsound.com/people/autechre
Wikipedia 2017, [Accessed February 2018], https://en.wikipedia.org/wiki/Minimal_music