Hi there! Welcome to my blog. My name is Chiara Louise, I am an up-and-coming Sound Designer from Scotland, and this blog will demonstrate my skills as a Sound Designer through the various projects that I will upload on an irregular basis.
This is a sneak peek featuring one of the newly animated scenes for my Autism Short Film. This film is going to be my project deliverable for my Honours Project at University. The animation was created using Adobe After Effects and Adobe Illustrator. Illustrator was used to create all of the visual assets, from the room design to the character designs and the props (e.g. the school bag). The only visuals that weren't made in Illustrator were the characters' arms and legs; the animators and I used a Rubber Hose plugin to create those. The sound mix is still missing at this stage: I'm going to create the sound mix for the short film in 2025, and the final render of the Short Film will be released in May 2025. Stay tuned!
Hi everyone! Today’s blog post is a showreel demonstrating my skills as a Sound Designer: I reproduced and mixed targeted sound elements for some of my favourite Disney films, adding a new Sound Design twist to these beloved movies. The sound elements I focused on are vehicles, weather systems, autonomous robots, ambiences, fictional creatures, and computer user interfaces. I used Pro Tools to edit, mix and master all of these sound elements.

This project allowed me to improve my skills using a digital audio workstation to edit, mix and master audio for picture, and to refine my recording techniques for Foley SFX and dialogue. It also showed me where I still need to improve: adding more detail by layering extra sound elements into projects such as this, and sharpening my mixing skills when working in a digital audio workstation.

The showreel deliberately features no music at all. This gives viewers the opportunity to hear how the sound elements alone determine the tone of a scene, and to appreciate that sound is a valuable element in film: without it, we do not get the complete cinematic experience.

Since this showreel features Disney film clips, I would like to remind everyone that I do not own the copyright for these clips; all rights go to Disney. I recommend using headphones so you can hear all of the audio featured in the showreel.
Hi everyone! This blog post is an explanatory gameplay demonstration of a 360 Virtual Reality experience which I programmed. My theme for this project is a water sound therapy session. The project’s purpose is to help people take some time out of their day to listen to the sounds of water, reducing the stress and tension in their minds and bodies. Hopefully, after playing this game, people will be inspired to go out into nature and find other water sources to listen to.

I built this project in the Unity game engine. In my previous blog video, I used the middleware software Wwise to deliver the audio to the engine; for this project I am only using Unity, and I have applied all of the audio directly in the Unity game engine. I’m working with both ambisonic audio and 2D audio. Ambisonic audio is multichannel audio designed to be heard from every direction, whereas 2D audio sounds the same regardless of which way the listener is facing. The ambisonic audio files in this project carry the water flow and motion; the 2D audio files are the music, the hotspot button, and the invisible audio cues.

In this video I explain how I programmed the project whilst playing the game, so you can hear how the audio sounds and see what my 360 VR experience looks like. I briefly cover the C# scripts I wrote and what they do to some of the game objects featured in the game, and I show how I made the water audio sound like it can be heard from every direction as the game camera moves.
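As a rough illustration of the 2D-versus-ambisonic split described above, here is a minimal Unity C# sketch. The class and field names are hypothetical, not taken from my project: in Unity, an ambisonic clip is flagged as "Ambisonic" in its import settings and decoded against the listener's orientation by the project's ambisonic decoder, while 2D audio is simply an AudioSource with spatialBlend set to 0.

```csharp
using UnityEngine;

// Hypothetical helper: one fully 2D source (music, hotspot sounds)
// and one source playing the ambisonic water bed.
public class TherapyAudioSetup : MonoBehaviour
{
    public AudioClip musicClip;  // plain stereo file
    public AudioClip waterClip;  // imported with the "Ambisonic" flag ticked

    void Start()
    {
        // 2D audio: heard identically no matter which way the camera faces.
        var music = gameObject.AddComponent<AudioSource>();
        music.clip = musicClip;
        music.spatialBlend = 0f;  // 0 = fully 2D, 1 = fully 3D
        music.loop = true;
        music.Play();

        // Ambisonic water bed: Unity rotates the decoded soundfield with
        // the listener, so the water appears to surround the camera.
        var water = gameObject.AddComponent<AudioSource>();
        water.clip = waterClip;
        water.loop = true;
        water.Play();
    }
}
```

This only runs inside a Unity project with an ambisonic decoder selected in the audio settings; it is a sketch of the technique, not my actual project code.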
Hiya! Today’s blog is about how I programmed audio into a game engine template. The game engine used for this project is Unity, a well-known engine that has been used to build games such as “Pokémon GO” and “Hollow Knight”. The template is called “The Courtyard”, a multi-level game set in a sci-fi fantasy world. In the first level, the player explores a temple deep within the desert; they find very little life there by day, but at night the temple comes to life. When the player interacts with an “Alien” non-player character, they are given a task to locate three stones which grant an extraordinary power, helping them complete the ultimate quest of the game, which is unknown at this point since the game is not fully developed.

To program audio into this game, I used a middleware software called Wwise to import, edit, mix and master the audio before integrating it into the Unity game engine through C# code written in Visual Studio. Five key audio containers are required for the game’s audio: an “Ambience” blend container, an “FPS Controller” actor-mixer, an “Emitters” blend container, a “Dialogue” actor-mixer, and a “Music” switch container. All the audio for those elements was imported, edited, and processed with plug-ins such as reverb. Some of the audio files needed additional elements, such as switches, states, game parameters and attenuations, to adapt the sounds for a 3D game. Once everything was mixed and mastered, I created play and stop events for all the containers and audio, added them to soundbanks, and generated the soundbanks into the Unity project.
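Once the soundbanks are generated, the C# side mostly comes down to loading a bank and posting the play and stop events, with states or switches steering variations like the temple coming to life at night. Here is a minimal sketch using the Wwise Unity integration; the bank name, event names and state group below are made up for illustration and are not the names from my project:

```csharp
using UnityEngine;

// Hypothetical names: a "Main" soundbank, "Play_Ambience"/"Stop_Ambience"
// events, and a "TimeOfDay" state group with a "Night" state.
public class CourtyardAudio : MonoBehaviour
{
    void Start()
    {
        AkBankManager.LoadBank("Main", false, false);          // load the generated soundbank
        AkSoundEngine.PostEvent("Play_Ambience", gameObject);  // fire the play event on this emitter
    }

    public void OnNightfall()
    {
        // States let the same event resolve to different audio,
        // e.g. the quiet desert temple "coming to life" at night.
        AkSoundEngine.SetState("TimeOfDay", "Night");
    }

    void OnDestroy()
    {
        AkSoundEngine.PostEvent("Stop_Ambience", gameObject);
    }
}
```

This requires a Unity project with the Wwise Unity integration installed; it is a sketch of the pattern, not the project's actual scripts.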
I also added additional components in the game engine: captions for the dialogue, a counter text that tells me how many stones I still have to collect, and a dropdown menu that lets me change the dialogue language, since I wanted this to be a multi-language game. The languages I chose are English and Spanish, because they are among the most widely spoken languages in the world. In this video you will see the final product of my audio programming in the game engine, and I will show you what a middleware software looks like as well as what C# code for a video game looks like.
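The language dropdown can drive Wwise directly, since the integration exposes a call for switching the current dialogue language. A rough sketch of that wiring, with the dropdown setup and language labels assumed rather than taken from my project (the labels must match the language names defined in the Wwise project):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical wiring: a UI Dropdown whose options are "English" and
// "Spanish", matching the languages set up in the Wwise project.
public class DialogueLanguageSelector : MonoBehaviour
{
    public Dropdown languageDropdown;

    void Start()
    {
        languageDropdown.onValueChanged.AddListener(OnLanguageChanged);
    }

    void OnLanguageChanged(int index)
    {
        string language = languageDropdown.options[index].text;
        AkSoundEngine.SetCurrentLanguage(language);
        // Dialogue events posted after this call stream the selected
        // language's voice files from the soundbanks.
    }
}
```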
If you don't have an account on Tumblr but you want to keep up to date with my content via notifications, subscribe to my YouTube channel.
Hiya! This is my first post on my new blog. It is a quick demonstration of programming sound into a 3D game in a game engine for one of my university assessments. Unfortunately there were some errors within the game engine, but I did manage to give a very brief explanation of how audio programming works for games, using a game level called “The Courtyard”. I programmed this level using a middleware software called Wwise to import and edit all the audio files, and I then integrated the Wwise project into the well-known Unity game engine.