benasante12
MD5357 - Audio Interaction and Code
13 posts
benasante12 · 4 years ago
MD5357 - Audio Interaction and Code
INTRODUCTION
My research folio will cover various existing projects and module tasks under four categories: Framework Hardware, Framework Software, Existing Hardware and Existing Software. Under these categories I will look at music-related projects that repurpose existing hardware, such as a keyboard and a lamp, as well as how coding software is used to enhance a musical performance, such as live sets in the real world and the virtual world. I will explore different avenues of modern technology and software, how they exist today, and how they can further impact music distribution, production and performance. Covering all four categories will give my research a broad range of diverse, interesting projects to help narrow down my final project proposal, as I log each project and explain in my own words its form, what it does and what I think its purpose is.
MY PRACTICE
Currently I am a Music Producer and I have been making music for just over a year now. I started out predominantly making Trap beats, but since the start of the year I have been making soundtracks intended for film, drawing inspiration from the likes of Hans Zimmer, Ludwig Göransson and Daft Punk. I have always had a passion for film soundtracks and visuals, for the relationship between them and how they work together. I feel this will deeply impact the research I conduct into virtual reality, interactive media and audio visualizers, and into how people are utilising these platforms musically.
SENSEI MULTIDIMENSIONAL SOUND SYNTHESIZER BY OSCAR OMENS
FRAMEWORK HARDWARE
The SENSEI synthesizer, created by Oscar Omens as part of his bachelor thesis, utilises Teensy and Arduino together with three FSR sensors to give the user a level of control that most synthesizers do not offer. Its core feature is the ability to manipulate the sound using the joystick and the touch of your fingers: the joystick determines how far away or how close the sound appears to be, as well as the richness and overall colour of the sound. The bracelet allows the user to play vibrato on each note while playing, and there is also the option to pitch-bend each key. My initial reaction to this synthesizer was surprise and intrigue, as I had never seen a joystick attached to a synth, and wearing a gyro bracelet whilst playing reminded me of Imogen Heap's Mi.Mu gloves. I find the design and functionality of this product really impressive, and if it were a mainstream product I think it would sell really well.
As stated above, this synth offers the user control that most others don't, and that could otherwise only be achieved with processing and automation in a DAW. My only concern is that, despite the level of features, the actual sound of the synth stays the same: there are no presets for different sounds, nor oscillators to create your own. I think these additional features would greatly improve the product. Overall, this project has inspired me to think about incorporating a distance parameter into my own project, so that the user has the option to control how close or how far away my music is played alongside visuals.
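To picture how this kind of control mapping works, here is a rough Python sketch of joystick and gyro readings being scaled into synth parameters. The ranges, scaling factors and function names are all my own assumptions for illustration, not taken from Oscar Omens's actual code:

```python
# Hypothetical control mapping for a SENSEI-style synth.
# All ranges and scalings are my own assumptions for illustration.

def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale a sensor reading into a parameter range."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def joystick_to_params(x, y):
    """Joystick X -> perceived distance (gain), Y -> colour (filter cutoff in Hz)."""
    gain = map_range(x, 0, 1023, 0.0, 1.0)         # closer = louder
    cutoff = map_range(y, 0, 1023, 200.0, 8000.0)  # up = brighter
    return gain, cutoff

def gyro_to_vibrato(angular_rate, depth_cents=50):
    """Wrist movement (deg/s) -> vibrato depth in cents, clipped to +/- depth_cents."""
    return max(-depth_cents, min(depth_cents, angular_rate * 0.5))
```

The same idea would apply to a distance parameter of my own: one sensor axis rescaled into a gain or reverb-mix range.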
https://vimeo.com/344125521
https://forum.pjrc.com/threads/56656-SENSEI-A-synthesizer-with-multidimensional-sound-shaping
MODULE TASK 1 - Basic Audiovisualizer
A patch I made exploring the basic form of an audio visualizer in Max MSP, using the visual tool and the computer's built-in microphone.
https://www.youtube.com/watch?v=W0vhb0EV4-A&feature=youtu.be
FRAMES GRAPHICAL SPECTRAL PROCESSOR FOR MAX FOR LIVE BY ALBERTO BARBERIS
EXISTING FRAMEWORK SOFTWARE
Frames by Alberto Barberis is a free spectral processing plugin for Ableton that lets you change the sonogram of any sound in real time, with a number of different transformations available to really warp the sound. Its core feature is directly shifting the sonogram, presented in a black-and-white format, and mapping each parameter to a modulator such as an LFO to begin manipulating the sound. The past/present feature, paired with the play speed, is really effective at slowing and dragging the sound until it is unrecognisable from the original instrument or source. From what I have seen in the video, this would work really well for achieving an ethereal, glitchy sound with a cinematic sci-fi feel. My one wish is that it were available for FL Studio and not just Ableton; I think this plugin is a highly effective warping tool, and commercial plugins like Halftime and Gross Beat don't offer the same level of transformation as Frames.
I think what could enhance this plugin even more would be a panner built into Frames that the user could assign to a specific change in the sonogram, panning left or right to add a more immersive quality to the sound. Moreover, what I take from this project is not necessarily building a plugin like this, as I already have an idea for music reacting to visuals; it is the thought that, whilst my music is playing, the listener could warp the sound somehow by clicking on the visual, whether by filtering the sound or glitching it entirely.
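To make the sonogram idea concrete, here is a toy Python sketch of two Frames-style transformations applied to a magnitude spectrogram stored as a plain 2D list (rows are frequency bins, columns are time frames). The real plugin works on FFT data inside Max for Live; this only illustrates the concept:

```python
# Toy sonogram transformations, Frames-style.
# A "sonogram" here is just a 2D list: rows = frequency bins, columns = time frames.

def shift_bins(sonogram, offset):
    """Move energy up (positive offset) or down the frequency axis,
    filling vacated bins with silence -- a crude spectral pitch shift."""
    n = len(sonogram)
    out = [[0.0] * len(sonogram[0]) for _ in range(n)]
    for i, row in enumerate(sonogram):
        j = i + offset
        if 0 <= j < n:
            out[j] = row[:]
    return out

def stretch_time(sonogram, speed):
    """Play-speed control: speed 0.5 holds each frame twice as long,
    which is how the slowed, dragged sound comes about."""
    frames = len(sonogram[0])
    idx = []
    t = 0.0
    while t < frames:
        idx.append(int(t))
        t += speed
    return [[row[i] for i in idx] for row in sonogram]
```

The panner I suggest could be one more function in the same spirit, weighting left/right gain by where the energy sits in the sonogram.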
https://albertobarberis.github.io/FRAMES/
DIY POLYPHONIC DRUM MACHINE AND LOOPER
FRAMEWORK HARDWARE
This project is a simple yet highly effective drum machine and looper made for Teensy that can easily be ported to Arduino. The drum machine has four sounds that sound exactly like the Roland TR-808, famous for revolutionising Hip-Hop and RnB production with its signature cowbell sound and 808 bass. As with the TR-808, the four sounds can be played simultaneously, giving the user the control to make some funky patterns and rhythmic beats. The tempo can also be controlled, with 16 subdivision patterns programmed in. I believe this project was created to serve as a cost-efficient DIY alternative drum machine for music producers and creative practitioners alike who perhaps can't afford expensive vintage gear. I also think it was created partly for novelty, as something this unusual isn't normally seen in its rough Teensy-board state, although the technology behind it is probably the same as that used in most drum machines sold today.
I really like this project as a whole and would be interested in doing something like this in the future. However, I do think it could be improved by having more sounds and more control over the pattern each sound is played in. For example, you could set one sound to play in ⅓ subdivisions and another in ⅙; this would open up more control for the user, affect the rhythms you could create and enhance the musical production.
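The polyphonic step-sequencer logic described above can be sketched in a few lines of Python. The patterns and voice names here are illustrative, not taken from the actual Teensy code:

```python
# Minimal sketch of a polyphonic 16-step drum sequencer:
# each voice has its own pattern, and one step can trigger several voices at once.

STEPS = 16

def make_pattern(hits):
    """Build a pattern from a set of step indices, e.g. {0, 4, 8, 12} = four-on-the-floor."""
    return [step in hits for step in range(STEPS)]

def voices_at(step, patterns):
    """Which voices fire on this step of the bar (the bar wraps every 16 steps)."""
    return [name for name, pat in patterns.items() if pat[step % STEPS]]

patterns = {
    "kick":    make_pattern({0, 4, 8, 12}),
    "snare":   make_pattern({4, 12}),
    "hat":     make_pattern(set(range(0, STEPS, 2))),
    "cowbell": make_pattern({2, 10}),
}
```

The improvement I suggest would amount to giving each voice its own step count instead of the shared 16, so one pattern wraps every 3 steps and another every 6.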
https://www.youtube.com/watch?v=uVq1ErSdXGY&feature=emb_title
GENETIC ALGORITHM IN PYTHON GENERATES MUSIC BY KIE CODES
FRAMEWORK SOFTWARE
This was an interesting project I came across by Kie Codes. It focuses on an algorithm programmed in Python that generates melodies and basslines, which can then be exported into a DAW and arranged into a song. The rationale was really an experiment to see if a computer could generate a melody that sounds good. The algorithm works by Python playing a melody to you and the listener rating it on a scale of one to five; if the score is low, Python adjusts and makes changes to the melody, essentially catering to the listener's taste until the score is high enough. I was quite surprised by the final result and arrangement towards the end of the YouTube video, as the bassline and melodies sounded good. Heavily computerized, of course, but I could see how this algorithm could be used to generate ideas for melodies which you could then expand upon. However, despite this project utilising machine learning, I would still argue that the human touch within music is always better than something artificial. As a Producer myself, even when I work with MIDI I try to off-center some of the notes and adjust velocities accordingly for a human touch, to avoid everything sounding too perfect. I know that in the case of this project you can do the same, but I am still unsure, from a musical standpoint, whether artists and musicians should really rely on an algorithm to generate melodies for them.
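My understanding of the rate-and-mutate loop can be sketched like this in Python; this is a toy reading of the idea, not Kie Codes's actual implementation:

```python
import random

# Toy rate-and-mutate loop: the listener's 1-5 score drives selection,
# and poorly rated melodies are mutated harder. Illustrative only.

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, as MIDI note numbers

def random_melody(rng, length=8):
    """Start from a random melody drawn from the scale."""
    return [rng.choice(SCALE) for _ in range(length)]

def mutate(melody, rng, rate=0.25):
    """Re-roll each note with probability `rate`."""
    return [rng.choice(SCALE) if rng.random() < rate else n for n in melody]

def evolve(melody, score, rng):
    """Keep a well-rated melody; mutate a poorly rated one, harder the worse it scored."""
    if score >= 4:
        return melody
    rate = (5 - score) / 5.0   # worse score -> more mutation
    return mutate(melody, rng, rate)
```

Repeating `evolve` across many listening rounds is what gradually bends the output towards the listener's taste.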
https://www.youtube.com/watch?v=uQj5UNhCPuo
DIY LED FACE MASK USING ARDUINO
EXISTING FRAMEWORK HARDWARE
This LED light-up face mask is achieved with eight LED strips linked to an Arduino board, allowing the mask to emit light, text and even emojis that would be perfect for wearing at live shows. Although I am more inclined towards a framework-software approach, I found this project interesting because it reminded me of Daft Punk's helmets, as seen when they were interviewed around their Discovery album in Japan. The emojis the mask displays are similar to those on Daft Punk's helmets; probably not as chic or expensive, but a similar result in terms of the technology behind it. Moreover, the way the person in the video achieves this looks relatively simple: drawing an object on an RGB grid, exporting the code into Arduino, and tweaking the code to achieve different icons and words. Overall I think it is a solid project, intended to serve as a much-enhanced face mask for Covid-19. I do, however, think the wearing comfort could be improved, with something more comfortable to sit around the ears than LED strips, but as a whole it is quite good. What I take from this project is a potential idea to incorporate light and icons as part of the visual element that goes with my music, either in the virtual world or physically, such as this face mask, if the LED light could interact with the music.
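One detail worth noting when several LED strips are driven as one grid is that they are often wired in a zigzag, so a drawn bitmap has to have every other row reversed before it is sent to the controller. I don't know exactly how the video handles this, but a hypothetical sketch of that grid-to-strip step in Python would be:

```python
# Hypothetical grid-to-strip conversion for a mask made of several LED strips
# wired in serpentine (zigzag) order. The wiring assumption is mine, not the video's.

def grid_to_strip(grid):
    """Flatten a row-major pixel grid into serpentine LED-strip order:
    even rows run left-to-right, odd rows are reversed."""
    out = []
    for r, row in enumerate(grid):
        out.extend(row if r % 2 == 0 else row[::-1])
    return out
```

The same flattening would apply whether the pixels hold on/off values or full RGB tuples.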
https://www.youtube.com/watch?v=zUG7pcuzTM8&feature=emb_title
MODULE TASK 2 - Arduino and Ableton
Controlling a low-pass filter with an LDR sensor and reverb with a potentiometer in Ableton.
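The mapping in this task boils down to rescaling a 10-bit Arduino analog reading (0-1023) into a 7-bit MIDI CC value (0-127) that Ableton can assign to the filter cutoff or reverb amount. A minimal Python sketch of that scaling, assuming a simple linear mapping:

```python
# Assumed linear scaling from a 10-bit analog reading to a 7-bit MIDI CC value.
# The actual sketch runs on the Arduino; this just shows the arithmetic.

def reading_to_cc(reading):
    """Scale a 0-1023 analog reading (LDR or potentiometer) to a 0-127 MIDI CC value."""
    return min(127, reading * 128 // 1024)
```

In practice the LDR reading is noisy, so some smoothing (e.g. averaging a few readings) would be applied before sending the CC.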
https://www.youtube.com/watch?v=j4vVMEOj9zQ&feature=youtu.be
MUSIC REACTIVE DESK LIGHT BY NERDFORGE
EXISTING FRAMEWORK HARDWARE
Repurposing an IKEA jar, an Arduino Nano and a sound detector, this simple desk light reacts to sound by emitting different colours; it doesn't, however, react to the intensity of the sound. Hansi from the Nerdforge team, a computer scientist, wanted to build a music visualizer but stated it 'seemed awfully complicated', then had a change of mind when browsing the web and discovering a sound detector module, realising it was an achievable project. The light runs from a twelve-volt power source and works through basic Arduino programming paired with the sound detector, with LED strips glued around a PVC pipe. Overall I do like this project because of the compactness of the light, but I think there is potential to scale the light and the code even further, as the desk light only detects the presence of sound and emits light accordingly, not its intensity, as I stated earlier. If the code were tweaked to add this feature, it would greatly enhance the light and the visualization aspect, and as part of a live show it would definitely be a great asset. In terms of how this project has inspired me, it definitely lies within the light-reacting element. However, unlike this desk light reacting in the physical world, I am more drawn to light reacting in a virtual sense to my music.
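The improvement I suggest above, reacting to intensity rather than just presence, could be sketched like this in Python, with a noise floor below which the light stays off (the thresholds are illustrative):

```python
# Intensity-reactive brightness instead of a simple on/off sound detector.
# Thresholds and ranges are illustrative assumptions.

def level_to_brightness(level, noise_floor=0.05, peak=1.0):
    """Map a 0.0-1.0 sound level to a 0-255 LED brightness,
    ignoring anything at or below the noise floor."""
    if level <= noise_floor:
        return 0
    t = (level - noise_floor) / (peak - noise_floor)
    return round(min(1.0, t) * 255)
```

The same curve ported to the Arduino sketch would make the jar pulse with the music rather than simply switch colour.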
https://www.youtube.com/watch?v=5oRir4dck_w
STRUCTURE.LIVE BY JULIEN BAYLE
EXISTING FRAMEWORK SOFTWARE
STRUCTURE.LIVE is an audiovisual live performance where the visuals interact with the music. I was quite captivated by the striking visuals and by the pulsation of the kicks and electronic synths creating this sonic world. It is less of a narrative piece: unlike traditional music videos, which have some sort of story, concept and visible performers, Structure.Live is about the expression of movement within a sonic space. Looking behind the scenes at how Julien Bayle achieved this, I am still unsure exactly how, but I caught a glimpse of a Max MSP patch, probably using the jit.world object, synced to some kind of controller that pairs with Ableton. After watching this video I have looked into jitter visuals in Max MSP, to achieve a similar visual without recording actual performers as Julien Bayle did. This is something I am considering doing as a project: having my own music, which I have produced, react to visuals that fit an overall aesthetic, to enhance the viewing and listening experience. The rationale behind this project, as described on Julien Bayle's website, is 'Addressing minimal & abstract concepts such continuum / disruption or order / chaos, the artist continues his exploration of radical sonic minimalism with STRUCTURE, by bringing a new bridge between sound & space'. I definitely think he has achieved this to a very high standard, and I will definitely take away using Max MSP to achieve something like this of my own.
http://julienbayle.net/works/structure-live/
MODULE TASK 3 - 3D TERRAIN AND SENSORS
A tester patch exploring what I could do for my final project.
https://www.youtube.com/watch?v=RgiSAsZVMxY&feature=youtu.be
THE ENTROPY GARDENS - A SPATIOTEMPORAL VR POEM CREATED BY LEONHARD LASS AND GREGOR LADENHAUF
FRAMEWORK SOFTWARE
The Entropy Gardens is another example of an audiovisual performance, with more spectacle, taking advantage of a VR format for further immersion. The piece was made in Unity, the cross-platform game engine created by Unity Technologies and used for a variety of things such as VFX and animation. Essentially, the video takes you on a virtual trip through a garden that changes over the course of time, with rocks floating over the viewer's head and even the viewer themselves being lifted into the sky in this 3D virtual world. Whispers of dialogue appear intermittently as the poem opens up, allowing the viewer to reflect in this virtual world. I found the experience somewhat strange, but I did like it as a whole. However, I think giving the viewer the chance to look around in a 360 environment would have made the piece even more immersive, and this is the only negative view I have of the work. What I do take from this, and do like, is the aspect of the viewer moving around and being taken on a journey, with the sound design aiding the sense of movement and of being in a different world. The rationale, I believe, is to challenge the viewer's sense of perception in a contemplative way in this virtual world. This has made me think more about what I said in my previous posts about music and visuals, and how I could perhaps make the experience more immersive if the viewer were moved around in a 3D space.
https://www.creativeapplications.net/environment/the-entropy-gardens-a-spatiotemporal-poem-in-vr
PROPOSAL
BRIEF
I want to produce a Max patch that enables my music to react to visuals of an abstract form. This will be a predominantly framework-software-based project, utilising Max MSP to code the patch and visuals, with the aim of creating a new, surreal experience for the listener.
RATIONALE
The reason I want to do this project is that music nowadays is often consumed only through streaming platforms such as Spotify, Deezer and Tidal, with vinyl a niche minority. This puts the listener in a spot where they are only listening to music, not seeing and experiencing the artform alongside visuals, which can further impact how the listener interprets the music. My aim in doing this project is to offer an alternative means of distribution that current streaming services are not providing. There is, however, Vevo on YouTube, and the music video industry has not totally vanished, but I want this project to take music and visuals back to an organic state, to encourage a much deeper self-reflection and to enable the listener to interpret their own meaning. Because my music is a cross between classical instruments and synthesizers, I will be aiming this at people who enjoy electronic/pop with a cinematic backdrop, targeted at the 16-35 age group. I would hope it appeals more to the older end of that audience because of the abstract nature of the visuals.
THE PLAN
I am going to realise this project by first familiarising myself further with Max MSP, and with how the jit.world object is really the key to visuals, working and operating within that format. I will do this by watching tutorials online, following along with how other people have produced similar projects, and drawing on ideas and example patches to make my own unique patch for this project. The resources I will need, other than Max MSP, are royalty-free footage to form the basis of the visuals, or I may create that myself in Max MSP or Blender.
THE RISKS
The risks and shortcomings I can foresee lie solely within the Max patch, as everything really depends on the code to get the visuals synced and reacting to the music. I could also see a problem in creating a patch in the visual style I want to achieve, as I am quite dependent on tutorials to get the base code and then make tweaks to my liking to realise this project.
THE LEGACY
I believe and hope that once the project is complete, the video will be a form of art to experience, and that I will have successfully achieved my vision. I also believe the video will have value, as it is not what most artists and musicians are putting out there. It would be an added form of expression that caters to the niche market I am targeting. Finally, I believe the listener will enjoy the video and will hope to see more produced along these lines in the future.