dvapix-blog
11 posts
dvapix-blog · 5 years ago
Sonic Performance - Music Video Game Creation
Week Six – Reflection and Future Development
For future development, I want to create a more customisable experience. I would introduce hotkeys to switch between different music segments, instrumentations and prebuilds, offer multiple sampler instruments to choose from and allow more player customisation. The use of switch containers and states is one way of making these ideas a reality. Another aspect I would improve is the way the effects work, making them more reactive using RTPC controllers as well as adding more of them. It would also be interesting to see whether the player could take full control without moving, operating entirely from hotkeys and toggling between effects and instruments on the fly, even though this would negate some of the interactivity and lose some of the immersion.
Another side of things I want to add is level design. It would be interesting to have multiple zones, each with its own instrumentation. Rather than manually toggling through switch containers, it could be set up so that upon walking into a space the instrumentation automatically changes, and whatever was playing before fades out into the new music. For example, an initial game design could take the ideas already established in this prototype and turn them into an oriental region as one of two level zones. From there an underwater zone could be designed with sound design elements and synthesisers (e.g. bubbles, rain, synths with long attacks and soft timbres) that give off an aesthetic reflective of the environment.
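As a rough sketch of the zone-based idea, the logic could look something like the following Python model (the zone names, switch states and fade time are all assumptions for illustration; in the actual project this would be Wwise switch/state changes driven from UE4, with the music switch container's transition rules handling the crossfade):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MusicZone:
    name: str
    switch_state: str  # the Wwise switch/state this zone would set

def crossfade_gains(t: float, fade_time: float) -> Tuple[float, float]:
    """Linear gains for the (outgoing, incoming) music, t seconds into the fade."""
    x = min(max(t / fade_time, 0.0), 1.0)
    return (1.0 - x, x)

class ZoneMusicController:
    """Tracks which zone's instrumentation should currently be playing."""

    def __init__(self, fade_time: float = 2.0):
        self.fade_time = fade_time
        self.current: Optional[MusicZone] = None

    def enter_zone(self, zone: MusicZone) -> str:
        # In the real project this is where the game would set a Wwise
        # switch/state; the transition rules on the music switch
        # container would then fade the old segment into the new one.
        self.current = zone
        return zone.switch_state

controller = ZoneMusicController(fade_time=2.0)
controller.enter_zone(MusicZone("oriental", "Music_Oriental"))
state = controller.enter_zone(MusicZone("underwater", "Music_Underwater"))
```

Halfway through the fade both pieces sit at half gain, which is the "fades out into the new music" behaviour described above.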
Week Five – Music Container MIDI and implementation
For the performance I decided to take a traditional piece of music used to help teach people learning the Guzheng and added music segments to complement it. I added percussive segments and a flute backing track that I designed myself (very basic, to demonstrate the performability of the game as an instrument), which can be toggled on or off when the player interacts with the objects placed in the game. Once these were made in Reaper, I rendered them out as audio and integrated them into the Wwise project as individual segments, while also creating a container with them all stacked so that they all play in time when I hit the command to play. Another addition I made was exporting MIDI data into the project as a music segment. I imported the MIDI information for the Guzheng part and linked the sampler to it. As each note of the sampler is programmed to be recognised within Wwise (which I set up in a previous week), I assigned the sampler to the imported MIDI so that the whole part can be played with a single button.
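A small Python model of the single-button playback (the note data is an illustrative pentatonic stand-in, not the project's actual MIDI): one trigger walks the imported segment in time order and sends every note to the sampler.

```python
from typing import Callable, List, Tuple

MidiEvent = Tuple[float, int]  # (beat position, MIDI note number)

# Illustrative stand-in for the imported Guzheng MIDI part
# (pentatonic D, E, F#, A, B in one octave).
GUZHENG_SEGMENT: List[MidiEvent] = [
    (0.0, 62), (0.5, 64), (1.0, 66), (1.5, 69), (2.0, 71),
]

def play_segment(segment: List[MidiEvent],
                 sampler: Callable[[float, int], int]) -> List[int]:
    """Send every note of the segment to the sampler in time order,
    modelling how one play command performs the whole MIDI part."""
    return [sampler(beat, note) for beat, note in sorted(segment)]

# A real implementation would post a Wwise play event per note; here
# the stand-in sampler just reports which note it was asked to play.
notes = play_segment(GUZHENG_SEGMENT, lambda beat, note: note)
```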
Another addition I made was ‘sidechaining’. I wanted the player to be able to make their part stand out from anything in the background. By applying a Meter effect plug-in to the sampler’s audio bus, I could output the sampler’s level as an RTPC (Real Time Parameter Control) value. The RTPC is attached to the music segments and controls the volume of the backing music, dependent on how loud the sampler is being played. This means that when the player uses the sampler in the game, everything else ducks, making their input the prominent part.
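The metering-to-RTPC ducking can be sketched as a simple curve (the threshold and maximum attenuation here are assumed values, standing in for the RTPC curve drawn in Wwise):

```python
def ducked_backing_volume_db(sampler_level_db: float,
                             threshold_db: float = -30.0,
                             max_duck_db: float = -12.0) -> float:
    """Map the sampler bus level (from the Meter plug-in) to a volume
    offset on the backing music, mimicking a Wwise RTPC curve.

    Below the threshold nothing is ducked; from the threshold up to
    0 dB the attenuation ramps linearly towards max_duck_db, so the
    harder the player plays, the more the backing music ducks.
    """
    if sampler_level_db <= threshold_db:
        return 0.0
    x = min((sampler_level_db - threshold_db) / (0.0 - threshold_db), 1.0)
    return x * max_duck_db
```

Quiet playing leaves the backing untouched, while playing at full level pulls it down by the maximum attenuation.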
Week Four – Level Design and Audio Integration
To implement the Guzheng, I created a chain of commands that play the sampler. These commands call events that tell Wwise to play the individual notes. The commands are all connected to keybinds so that when certain keys on the keyboard are pressed, the individual notes are played in the game. To update the effects that are supposed to be applied to the sampler, I had to give it positional information. As the music is meant to follow the player, I used the Player Character Transform to tell the audio events to use the player’s positional information.
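The key-to-note chain can be modelled as a lookup from keys to play events (the key choices and event names here are hypothetical, not the project's actual identifiers; in UE4 the resolved event would be posted on the player's game object so the sound inherits the Player Character Transform):

```python
from typing import Optional, Tuple

# Hypothetical key-to-event table covering the five recorded
# pentatonic notes (D, E, F#, A, B).
KEY_TO_EVENT = {
    "A": "Play_Guzheng_D4",
    "S": "Play_Guzheng_E4",
    "D": "Play_Guzheng_Fs4",
    "F": "Play_Guzheng_A4",
    "G": "Play_Guzheng_B4",
}

def on_key_pressed(key: str,
                   player_position: Tuple[float, float, float]) -> Optional[str]:
    """Resolve a key press to the play event that would be posted.

    player_position stands in for the Player Character Transform: the
    event is fired at the player's location so positional effects
    (the effect zones) track the performer.
    """
    return KEY_TO_EVENT.get(key.upper())

event = on_key_pressed("a", (0.0, 0.0, 0.0))
```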
Week Four – Level Design and Audio Integration
The next stage was to design the environment for the performance. I plan to develop this further into multiple worlds with sounds associated with each immersive realm (ocean world, oriental scene, sci-fi etc.). However, for the purpose of this performance I opted for a more basic approach. Using some free texture packs, I created a small square space that the player can move around in but is confined to. Blue zones in the level represent the effect zones. They have auxiliary busses assigned to them from Wwise, with differing effects such as reverbs or delays for each zone. As the player moves between the zones, the effects toggle on or off. Red spheres represent objects that can be interacted with. I plan to use these as a way of turning different segments of backing music on or off, so that I can perform with them.
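The zone toggling amounts to checking which zone bounds contain the player and setting the matching aux-bus send (the bus names and rectangle coordinates below are made up for illustration; in UE4 this is typically done with trigger volumes feeding Wwise aux sends):

```python
from typing import Dict, Tuple

Rect = Tuple[float, float, float, float]  # (min_x, min_y, max_x, max_y)

# Illustrative zone bounds and auxiliary bus names.
EFFECT_ZONES: Dict[str, Rect] = {
    "Aux_Reverb_Hall": (0.0, 0.0, 10.0, 10.0),
    "Aux_Delay_Echo": (15.0, 0.0, 25.0, 10.0),
}

def active_aux_sends(player_xy: Tuple[float, float]) -> Dict[str, float]:
    """Return the send level for each auxiliary bus: 1.0 while the
    player stands inside that zone's bounds, 0.0 otherwise, matching
    the on/off toggling as the player moves between zones."""
    x, y = player_xy
    return {
        bus: 1.0 if (x0 <= x <= x1 and y0 <= y <= y1) else 0.0
        for bus, (x0, y0, x1, y1) in EFFECT_ZONES.items()
    }
```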
Week Three – Sampler Integration
To integrate the sampler into the project, I placed the recorded samples into a blend container. From there I edited the individual notes to fill in missing notes should I need them. Within the advanced settings there is the option to enable MIDI, along with a keymapping editor. There you can pick the recorded notes and decide whether you want them transposed across multiple keys. For example, the recordings jump from B to D, missing out C and C#. In the keymapping editor I assigned B to be transposed to cover both C and C# if necessary, while D is transposed to accommodate D#, and so on, so that if any future MIDI is implemented this sampler can handle the extra notes and they won’t be missed out. (Wwise has a music segment function which takes MIDI data and stores it; the sampler can then be referenced as an instrument to play that MIDI and, provided it is set up correctly, it will perform the MIDI segments.)
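The "fill in the gaps" mapping boils down to pitch-shifting the nearest recorded sample, where one semitone of transposition corresponds to a playback-rate factor of 2^(1/12). A small model of the mapping described above:

```python
# Recorded pentatonic notes as MIDI numbers (one octave shown).
RECORDED = {"D": 62, "E": 64, "F#": 66, "A": 69, "B": 71}

def playback_rate(sample_midi: int, target_midi: int) -> float:
    """Playback-rate factor that transposes a recorded sample to the
    target pitch: each semitone is a factor of 2 ** (1 / 12)."""
    return 2.0 ** ((target_midi - sample_midi) / 12.0)

# B (71) is assigned to cover the missing C (72) and C# (73), and
# D (62) covers D# (63), as set up in the keymapping editor.
rate_c = playback_rate(RECORDED["B"], 72)   # up one semitone
rate_cs = playback_rate(RECORDED["B"], 73)  # up two semitones
rate_ds = playback_rate(RECORDED["D"], 63)  # up one semitone
```

An octave of transposition doubles the rate, which is why keeping each gap note close to a recorded neighbour keeps the timbre natural.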
Week Two – Sampler Recording
The first aspect I wanted to tackle was the sampler instrument. I decided to record my own samples to load into Wwise, choosing the Guzheng as my instrument. Due to the coronavirus circumstances, I was limited to the microphones I had at my disposal. I used an omnidirectional condenser (Rode NT2A) to capture the inner chamber of the instrument in tandem with a shotgun microphone (Rode NTG2) to capture the plucking of the strings. Due to the design of the Guzheng, I only recorded five notes from each available octave – D, E, F#, A and B. The Guzheng is tuned to a pentatonic scale, which is the standard tuning for the instrument. Fortunately, within Wwise there is a way to “fill in the gaps” for the notes that are missing. I used the iZotope audio editor to clean up the sound where necessary before mixing the two recordings and rendering them as one audio file for the Wwise project.
Week One – Research and Planning/Idea
The basic idea is to create a musical performance through video game technology. I plan to use Unreal Engine 4 and Wwise to implement a performance environment that can play musical events and segments responding to the player’s actions. Using auxiliary game sends, I can make the musical components responsive to the player’s location and toggle different effects depending on the player’s movements. I also want to build a playable sampler that is controllable in-game. These three components combined should create something representative of a musical performance that is unique to each individual launch of the game environment.