Final Reflection
What were your thoughts about your research topic going into the first week of the semester?
Going into the first week of the semester, I was very optimistic about my research topic. At that point I had an idea of which AI software I wanted to use, and where exactly I wanted to use it. As the weeks went on, though, I began to realize both how many AI tools there were and how tedious some of them would be to implement into the project. That is when I decided to take a step back and look at the most basic tools in the most simple and logical ways.
We are quite literally watching a technological boom happen all around us with AI. I understood from the beginning that AI is both a tool and a weapon, so it's up to me to figure out how I can utilize it to further narratives or open up different filming techniques.
What are your thoughts now about your research topic and your project/practicum?
I now have a better understanding of how I can utilize these AI tools in future projects, but getting to the end with a presentable in one piece took a physical and even mental toll on me. During the project I had my doubts about reaching the end, and wondered if I would make it in time with how stressful the production side of things was.
Not only was the production hellish and grueling, but toward the end of the semester I even had ideas for what I could have done as a presentable instead of finalizing everything into one single short film.
One idea that crossed my mind about 75% of the way through production was to provide multiple shorter, separate presentables (under 5 minutes each) that utilized AI in different ways. For instance, one presentable could have shown AI being used to create a storyboard, with a comparison between the storyboard and the final product. Another could have revolved around style transfers and AI's ability to stylize footage. Essentially, things that would have been easier and far less stressful than the final product I churned out.
In the end, though, I was happy with the presentable I made. I wish the results had turned out better, but the project forced me to push myself, and it allowed me to meet a few amazing people that I want to continue working with in the future on other amazing ideas.
What did you learn about yourself as a creative through this journey?
I knew how people felt about the topic of AI, and how touchy it might be within this community, but I found a lot of support from many local filmmakers. HFC was one place whose members definitely added to the optimism. It taught me that I am never alone in these ideas, and that this passion to ask questions, be curious, and experiment with new technology is something I should continue to pursue. Regardless of whether people agree or disagree with me, I should stay curious and passionate about the things I'm interested in, because somewhere out there is a huge group of people who wonder the same.
What did you learn about the creative and production process?
The creative process was hard. While pre-production was a breeze, having to work around volunteers' schedules, and sometimes even crew schedules, meant that in the end we only had a limited number of shooting days, and even then the weather was not consistent. It taught me that I need to level my ambitions to what the semester can handle.
What advice would you give new senior capstone students?
When you are venturing into uncharted territory, or diving into something new to you, make sure you don't set your standards too high, and make sure your goal is attainable under proper conditions.
Where to now?
My goal is to always improve my craft. This project was an experiment for me, and I have other things I want to experiment with outside of the AI realm (like infrared, or Canon Magic Lantern software). But I do think that playing around with things like AI, and doing the research beforehand, helps me better understand the whole controversy behind it, and fully understand what makes it a valuable tool or a dangerous weapon.
Looking Back on my Project
Looking back and seeing how my project has evolved, I am surprised I got this far given how many loops it threw me through. For starters, my initial thought for this short film was to use EBSynth to stylize the footage. Through my testing, that proved to be a lot more tedious than I was expecting. Another huge hurdle was the sheer number of tools becoming available for the public to try out. As more things came out, my imagination became more convoluted. In the end, I chose to focus on just a few AI tools that everyone can use in their own films right now: Midjourney, Runway, and the Adobe tools.
My casting also elevated this project into a higher form. Originally I was expecting college students to be interested; the idea was that working with people still trying to get their start in acting or other media would let me keep things casual. That all changed when I sent out a casting call and people from the actual industry started to respond. It forced me to do more work on the planning side than I foresaw, but in the end I was able to adjust and adapt.
Now, on my final stretch, I still need to get one more scene due to the terrible weather that has plagued us. As the semester comes to a close, I have nowhere to go but forward to see this project's completion. Good luck with the final week, everyone!
A Helpful AI Tool for Anyone in a Pinch!
This past weekend, we shot at Liliʻuokalani Botanical Garden. The decision came down to its very beautiful landscape, and a waterfall that looked on par with the scenario I was going for. Unfortunately, the location's biggest pro became one of its biggest cons as well.
The waterfall itself looked great on screen, but it left an inconsistent level of white noise in the audio. My solution was to see if Adobe Podcast's Enhance Speech AI cleaner could help remove some of the background noise... Spoiler alert: it didn't.
I thought I would need to ADR the audio until I came across a Reddit user who claimed to have created his own neural AI that removes background noise fairly easily. I tried it and was completely blown away by the results. I'll attach the audio clips so you can take a listen for yourself, as well as the link to the website if you want to try it on your own project! Link: https://audo.ai/noise-removal
ORIGINAL AUDIO QUALITY
ADOBE AI "ENHANCED" VERSION
AUDO.AI NOISE REMOVAL VERSION
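If you're curious what's under the hood of tools like these, a classic building block is spectral gating: estimate the noise floor from a dialogue-free clip (say, a few seconds of pure waterfall), then mute any frequency bin that stays below it. This is just a minimal numpy sketch of that idea, not audo.ai's actual algorithm, which is a trained neural model:

```python
import numpy as np

def spectral_gate(signal, noise_clip, frame=1024, hop=512, factor=2.0):
    """Crude spectral gating: zero any FFT bin whose magnitude falls
    below `factor` times the average noise floor, estimated from a
    noise-only clip."""
    win = np.hanning(frame)
    # per-bin noise floor, averaged over frames of the noise-only clip
    noise_frames = [np.abs(np.fft.rfft(noise_clip[i:i + frame] * win))
                    for i in range(0, len(noise_clip) - frame, hop)]
    floor = np.mean(noise_frames, axis=0) * factor
    out = np.zeros(len(signal))
    norm = np.zeros(len(signal))
    for i in range(0, len(signal) - frame, hop):
        spec = np.fft.rfft(signal[i:i + frame] * win)
        spec[np.abs(spec) < floor] = 0.0      # gate the noisy bins
        out[i:i + frame] += np.fft.irfft(spec, frame) * win
        norm[i:i + frame] += win ** 2
    # overlap-add normalization
    return out / np.maximum(norm, 1e-8)
```

Dialogue energy concentrates in a few loud bins, so it survives the gate while broadband waterfall hiss gets muted. The frame size and threshold factor here are illustrative defaults, not tuned values.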
Tumblr would only let me add one video so here’s some of the BTS from our practice session
2 notes
·
View notes
Text
Shoot Day 2 - FIGHT SCENES
Man, it was a long day, but we got a HUGE chunk of our scenes out of the way.
Shooting Update #1
This past Sunday was the first day of filming the live scenes of my capstone project… And it didn’t go as smoothly as I thought it would.
Firstly, when creating the call sheet for the crew and actors, one thing we are responsible for is making sure the weather conditions are right for the scenes we're shooting.
Over the past few days the weather apps consistently said it would be partially sunny, with a 0-12% chance of rain… Unfortunately, that was not the case. Sunday's weather brought some of the worst rain I've seen on a set. Mix that with dirt paths and inconsistent pockets of sun, and bringing equipment in and out while keeping both the gear and the actors dry was a harder task than expected.
On top of that, one of my actors came to set late. I had set call time at 8am for the actors, but because of the horrible weather I pushed it to 9:30am. While my main actress arrived on time, the supporting actor didn't show up until almost 10:30am. While waiting for him, however, we were able to knock out a few scenes without him by getting close-ups of the main actress.
The supporting actor was also scheduled to leave before 1:30, so we had to try and pump out everything before then.
Originally we planned to shoot two scenes, but due to the weather we only got one. We did about a third of what we planned for the day, and it really put a damper on things.
Because of this, I was forced to cut one scene and focus hard on simplifying others.
A huge damper for sure, but something I believe I can recover from with a bit of extra planning.
AI Blending with Live Action
“Okay Matt” you may be thinking, “you’ve shown us how you’d be able to implement AI into your short film, but do you know if that works?”
Great question, and it’s something I wanted to verify prior to actual filming. That’s why last week I went to my first shooting location to get some footage and make sure that what I shot would work with the AI backgrounds I have… And truth be told? It looks REALLY good.
Here are some examples of what I mean:
This is the image that I took from my set location; aka Aiea Loop Trail.
This is an AI image I generated of a destroyed house, one that will be used for a set extension on my shoot date.
And this is the foreground from my footage blended with the background extension. I left it ungraded to help show where the background starts and the foreground ends.
And this is the final image blended together! Of course, it needs a color grade to pull everything into one harmonious image, and it would be more presentable with a subject in the foreground, but scouting by myself had its limitations. For a test, though, I think the results turned out VERY good! Up next: actual BTS of my shoot! Here we go! T-minus 3 weeks until the end of the semester; we're in it for the long run now!
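For anyone curious what a "blend" boils down to before grading, here's a toy numpy sketch of a hard green-screen composite. Real keyers (like Keylight in After Effects) use soft mattes and spill suppression; the `margin` threshold here is just an illustrative knob:

```python
import numpy as np

def green_screen_composite(fg, bg, margin=30):
    """Composite the AI background into the green areas of a foreground
    plate. A pixel counts as green screen when its G channel exceeds
    both R and B by `margin`. Both images are HxWx3 uint8 RGB."""
    r = fg[..., 0].astype(int)
    g = fg[..., 1].astype(int)
    b = fg[..., 2].astype(int)
    screen = (g - r > margin) & (g - b > margin)
    out = fg.copy()
    out[screen] = bg[screen]   # replace keyed pixels with the background
    return out
```

After this hard cut, a matched color grade over the combined image is what sells the two layers as one shot, which is exactly the ungraded-vs-graded difference shown above.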
This past week I ran a table read with my actors. Here is an audio log for anyone who is interested, along with my script for anyone who wants to follow along.
I realized I haven't shared much of the generative art used in my pre-production. Here are images from my storyboard, which I created using Midjourney. None of these will be used in the final project (except for the burning house), but they were vital to my storyboarding process, helping me visualize scenes and compositions and acting as an overall mood board.
From my Weekly Engagement
Being mid-production on my final presentable, one thing I need to be more aware of in future productions is understanding the limits of my actors, my production, and my overall ambitions.
Many people know the story of Icarus and his flight too close to the sun. Unfortunately for me, my capstone started to mirror it.
My goal for this production was honestly very ambitious, so much so that I wanted to do all of the short film's dialogue in Japanese. My hope was to find a cast who could speak fluent Japanese while I communicated with and directed them in English. I shot myself in the foot with that request, and it made me question, “Was it worth it?” When I did find people willing to be part of my production, Japanese was not one of their languages. Even the speakers I asked to help proofread my script were not 100% fluent, and could only help so much. Ultimately I had to dial it back; the closer I got to the end of the semester, the more I realized that my vision for something bigger had flown a little too close to the sun.
Fortunately for me, this wasn’t the end. I have found a great cast, all of them talented actors, and they have guided and helped me as much as they could: calling in favors from friends and meeting with people who did know the language to bring at least some authenticity to my project. I am very thankful for their passion and ambition, even when the project sounded unachievable. But that is sort of the point: to push myself into this unknown area and figure it out. That’s the only true way to build character.
I am learning that even though the focus of my capstone is letting AI help me as a tool, there are still so many factors and aspects that I simply need to do myself, no hesitation. Moving forward, I aim to rehearse with my actors, finish gathering my props, and make everything as smooth as possible for when we finally start principal photography.
Progress Update 2/27/2024 - 3/12/2024
This progress update will be short and quick. I’ve finally hit production on my capstone presentable, but not without some (extremely drastic) hurdles.
For starters, my casting call took a while to get replies, but things started to work out as I began working with my actors. One hurdle, and a reason I believe not many people replied, was my vision of having the characters speak in Japanese. The goal of having actors deliver their lines in a foreign language also heavily scared me when it came to my script.
Although I used multiple translation AIs to get my script into Japanese, my fear of inaccuracy led me to check it with people who might be able to correct it. This turned into a task that lasted much longer than I anticipated, as the people I knew who spoke the language weren’t 100% fluent, or couldn’t read kanji. I plan to work around the parts that aren’t set in stone.
Another obstacle during pre-production involved deliveries and wait times. I started ordering the props and other equipment required for my production. Even though all the props were listed with a two-week delivery window (everything was expected to arrive by 3/6), Amazon started sending delayed-shipment notifications, alerting me that things wouldn’t be delivered until 3/21 at the latest. As of now, most of the items have come in, but things such as swords (a vital part of my film) have yet to arrive.
It isn’t all bad news, though: the person I cast as my main actress is very devoted to the project despite not being fluent in the language. Her collaboration led me to change the script a bit to have both English and Japanese segments, giving the dialogue some leeway. This also helped me flesh out the characters’ backgrounds and personalities, since it helped me envision them better.
Meanwhile, while waiting, I began working on other establishing shots, including one I think turned out really well. In my script, I wrote that there would be shots of a burning house. After combining elements from Midjourney and Runway, and further compositing the results in Premiere and After Effects, I was able to create a very good-looking scene of arson in the forest.
In the meantime, I am prepped to start filming, and I hope to go over things with my actors beforehand.
Until next time!
Questioning the reason for it all… A mental note
While the production side of my capstone is now underway, I’ve been contemplating the question my project is truly setting forth.
Originally it was to see how AI could assist in the development of student work. That quickly changed after the SAG-AFTRA strike. The way studios were quick to try to replace creatives is NOT something I can get behind. Replacing creativity and people’s livelihoods with generated content is not what I envision the future of filmmaking to be.
From there the question became, “Can we use it responsibly, without impeding or replacing artists and creatives?” While that falls more in line with what I aim to accomplish, the truth is that AI is both a tool and a weapon; it all comes down to how it is used within the scope of the project. So while it is a valid question, anything, not just AI, can be used responsibly.
Now that I’m looking at it, I think the definitive question my capstone is trying to answer, when it comes to the morality of it all, is: “As a student creative, is using AI in a responsible manner worth it?” My presentable stays the same either way; it’s proof that it can be done. But once it’s all over, I should be able to answer “yes” or “no,” using the production of the presentable as my example for why it is or isn’t worth using AI at all.
Just food for thought.
An example of how I am using ChatGPT for my script
I would like to present an update on how I chose to implement ChatGPT within my presentable. While it would be easy to just tell ChatGPT to "write me a script based on ___", I feel that approach would not only produce a surface-level project, but one that lacks creativity as well. As I stated in other updates, I want to treat ChatGPT as an assistant. In this case, I stumbled across a cultural question while writing my script. For context, one of my characters refers to another as "Lord," and I wondered whether that was appropriate as a title, since "Lord" carries different meanings in different cultures.
Instead of just describing the workflow, I thought I'd present the conversation I had with ChatGPT in full.

----------------------- CHATGPT CONVERSATION -----------------------
MATT: In Japanese culture, is it common to have Lords?
CHATGPT: In traditional Japanese culture, particularly during the feudal period from the 12th to the 19th century, lords played a significant role in society. These lords, known as "daimyo," were powerful landowners who controlled territories throughout Japan. They were essentially feudal lords who held significant political, military, and economic power within their domains.
Under the daimyo were the samurai, who served as warriors and administrators, sworn to loyalty to their lords. The relationship between a lord and their samurai was a central aspect of the feudal system in Japan.
While the feudal system and the role of daimyo have largely faded away with the modernization of Japan, aspects of this historical system and the cultural reverence for hierarchy and authority still influence Japanese society to some extent today. However, in contemporary Japan, the term "lord" is not commonly used to refer to individuals holding such positions of power. Instead, Japan has a modern democratic government with elected officials and a constitutional monarchy.
MATT: Did japanese workers, often warriors, consider their bosses as lords?
CHATGPT: In feudal Japan, warriors, such as samurai, were typically bound by loyalty to their lords, known as daimyo. This loyalty was a fundamental aspect of the feudal system, where samurai served their lords in exchange for protection, land, and status. The relationship between a samurai and their lord was characterized by a sense of duty, honor, and obedience.
While the concept of a "boss" in the modern workplace context is different from the relationship between a samurai and their lord, there are some parallels. In contemporary Japanese culture, there is a strong emphasis on respect for authority and hierarchy in the workplace. Employees often show deference and loyalty to their superiors, and the relationship between workers and their bosses can sometimes resemble the dynamics of feudal allegiance.
However, it's essential to note that the modern workplace in Japan operates within a different social and legal framework compared to the feudal system of the past. While there may be elements of respect and loyalty reminiscent of the relationship between samurai and their lords, the dynamics are not identical, and the terminology and expectations have evolved to suit the modern context.
MATT: could anyone be a daimyo, or did daimyo need to have political power?
CHATGPT: In feudal Japan, the title of daimyo was typically conferred upon individuals who held significant political, military, and economic power within their respective domains. While there was a degree of hereditary succession in some cases, particularly among powerful families, the ability to become a daimyo was not solely determined by birthright. Factors such as military prowess, wealth, connections to the ruling shogunate or imperial court, and alliances with other influential families could all contribute to someone attaining the status of daimyo.
Furthermore, during periods of civil strife or instability, individuals with the military strength and political acumen could rise to power and establish themselves as daimyo through conquest or alliance-building.
In essence, while there were certain societal structures and expectations that influenced who could become a daimyo, the position was not strictly limited to individuals with inherited political power. Successful daimyo often demonstrated their ability to govern effectively, maintain order within their domains, and navigate the complex political landscape of feudal Japan.
MATT: could someone who just had a lot of money be considered a daimyo?
CHATGPT: In feudal Japan, while wealth certainly played a significant role in one's ability to exert power and influence, simply having a lot of money would not typically be sufficient to be considered a daimyo. The title of daimyo was not solely based on wealth but rather on a combination of factors, including political power, military strength, land ownership, connections to the ruling authorities, and social status.
Many daimyo came from noble families with long-standing histories of power and influence. Others rose to prominence through military prowess, alliances, or strategic marriages. While wealth could facilitate these endeavors by allowing individuals to raise armies, fund construction projects, or bribe officials, it was not the sole determinant of daimyo status.
However, it's important to note that during periods of upheaval or transition, such as civil wars or when central authority was weakened, individuals with significant wealth could sometimes leverage their resources to acquire political power and establish themselves as daimyo. Yet, even in these cases, they would typically need to demonstrate the ability to govern effectively and maintain control over a territory to be widely recognized as daimyo.
MATT: I am creating a short film set in Japan and would like your advice using this information. In the story, someone is trying to meet with a character, who is of high wealth due to his past as a bounty hunter or samurai. However, he has put his past behind him and is trying to live peacefully. The person who is trying to see him is stopped by a hired guard. Would this guard refer to the character as "lord"?
CHATGPT: In your short film scenario, the guard's manner of addressing the character could depend on various factors, including the guard's own background, the cultural context of the setting, and the specific relationship between the character and the guard.
Given that the character is portrayed as having a high status due to their past as a bounty hunter or samurai, it's possible that the guard might show a certain level of deference or respect when addressing them. In the feudal context of Japan, individuals of high status were often addressed with honorific titles such as "lord" or "master."
However, since your character is depicted as having retired from their past life and is attempting to live peacefully, they may prefer not to be addressed with such titles, as it could remind them of a time they are trying to leave behind. The guard may also be aware of this and choose to address the character in a more neutral or respectful manner, using their name or a respectful term like "sir" or "mister."
Ultimately, the way the guard addresses the character could serve to highlight their respect for the character's past accomplishments while also acknowledging their desire for a peaceful life. This can add depth to the characters and help convey the themes of the story effectively.
MATT: So would it be more accurate to call him "Lord" or "Master"?
CHATGPT: In the context of feudal Japan, "Lord" or "Master" could be appropriate titles for addressing someone of high status like a former samurai or bounty hunter who has retired. These titles convey a sense of respect and deference to the individual's past accomplishments and status.
However, it's essential to consider the character's personal preferences and the dynamics of the specific situation. If the character has put their past behind them and prefers not to be reminded of their former title or status, the guard might opt for a more neutral or respectful term like "Sir" or simply their name.
Ultimately, the choice of title should reflect the relationship between the character and the guard, as well as the tone and themes of your short film. You might consider exploring how the use of different titles impacts the interactions between the characters and contributes to the overall atmosphere and narrative of the story.
MATT: In this story, the guard would have respect for the character. Based on this character's personality, I don't think anything that reminds him of his past would phase him, as he already lives with the guilt of what he has done.
However I do believe "Master" being more neutral would be more structurally correct.
CHATGPT: Given the character's personality and the guard's respect for him, using "Master" as a title for the character would indeed be a suitable choice. "Master" carries a sense of respect and formality without necessarily evoking specific connotations of past roles or titles. It's a neutral term that acknowledges the character's status and authority without necessarily highlighting their past.
Using "Master" in this context would align with the guard's respect for the character and their acknowledgment of his position without overly emphasizing his past as a samurai or bounty hunter. It allows the guard to show deference and respect while also maintaining a level of neutrality and discretion that might be appropriate given the character's desire for a peaceful life.
Ultimately, the choice of title should serve the story and characters in a way that feels authentic and meaningful to the narrative you're creating. If "Master" feels structurally correct and aligns with the themes and dynamics of your story, then it's a suitable choice for the guard's manner of addressing the character.
MATT: Thank you
CHATGPT: You're welcome! If you have any more questions or need further assistance as you develop your short film, feel free to ask. Good luck with your project!
-------------------------------------- END --------------------------------------

Make sure to double-check ChatGPT's responses, as it isn't always 100% correct. And don't forget to thank your AI in case of a robot uprising 😜
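If you'd rather script this kind of consultation than use the chat window, something along these lines works with OpenAI's Python SDK. The model name and prompt wording here are placeholders of mine, not what I actually used:

```python
def build_consultation(question, context):
    """Assemble a chat request that frames the model as a cautious
    research assistant rather than a script-writing machine."""
    return [
        {"role": "system",
         "content": "You are a research assistant for a short-film writer. "
                    "Answer cultural questions cautiously and flag anything "
                    "you are unsure about."},
        {"role": "user",
         "content": f"Context: {context}\n\nQuestion: {question}"},
    ]

messages = build_consultation(
    "Would a hired guard address a retired samurai as 'Lord' or 'Master'?",
    "Short film set in a stylized feudal Japan.",
)

# To actually send it (requires `pip install openai` and an OPENAI_API_KEY):
# from openai import OpenAI
# reply = OpenAI().chat.completions.create(model="gpt-4o-mini",
#                                          messages=messages)
# print(reply.choices[0].message.content)
```

The system prompt is doing the "assistant, not author" framing from this post: the model advises, and the creative decision stays with you.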
Until Next Update!
Opening Update
Just a small update: I made an opening based on 1950s Toho intros.
I found a blurred background and used an image-to-video AI (Runway) to give it movement. I then created this logo in Premiere, and added film grain and a wiggle effect in After Effects to give it that old-school gate weave.
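For the curious, the grain-and-weave recipe is simple enough to sketch outside After Effects. This numpy version is a rough stand-in for what a wiggle expression plus a grain layer do; the amplitudes are made-up values, not my actual settings:

```python
import numpy as np

def age_frame(frame, t, rng, weave_amp=2.0, grain_sigma=6.0):
    """Fake projector gate weave plus film grain on one RGB frame.
    `t` is the frame index; the weave uses slow sinusoids so the
    drift looks mechanical rather than random."""
    h, w = frame.shape[:2]
    # gate weave: small, slow horizontal/vertical drift of the image
    dx = int(round(weave_amp * np.sin(0.9 * t)))
    dy = int(round(weave_amp * np.sin(1.3 * t + 1.0)))
    weaved = np.roll(frame, (dy, dx), axis=(0, 1))
    # film grain: per-pixel gaussian noise shared across channels
    grain = rng.normal(0.0, grain_sigma, (h, w, 1))
    return np.clip(weaved.astype(float) + grain, 0, 255).astype(np.uint8)
```

Running this over every frame of a static logo gives the shaky, noisy old-print feel; in After Effects the equivalent is a `wiggle()` expression on position plus a grain effect layer.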
I translated English to Japanese using ChatGPT, and double checked the translation with some bilingual friends for authenticity.
For comparison, here is the original Toho logo that inspired the look.
2/4/2024-2/18/2024 Progress Update
I just wanted to get a head start on the next progress update. These past few weeks brought a lot of interesting developments. Firstly, I started and mostly finished my script for the project. I have the beginning and ending fully fleshed out, at least the way I want them. My hurdle is the middle: how I’m going to connect the two parts together. This is where I will use ChatGPT as a virtual assistant. I will explain what I already have and ask for ideas on where the story could go from there. My goal is not to have it write specific scenes, but to give me scenarios that I can apply to the script myself.
For context, my short film will be styled after old Japanese samurai films, but with futuristic elements.
I already used ChatGPT to come up with names for my characters by describing their personalities and the nature of my script. The responses it gave me, and the reasoning behind the names, turned out really great.
Alongside starting the script, I have also been practicing more with the AI tools. This time, instead of doing something unrelated to the project, I built out a scene from my script, which I’ll link below.
I filmed the ship practically, once again using a model I found on Amazon, and combined it with four AI tools: Midjourney, Pixverse, Colormatch.ai, and Suno.ai.
I created the background using Midjourney, something I went over in my previous progress update. The shot I came up with is a top-down view of a forest. From there I threw it into Pixverse, a free AI website that adds movement to photos using depth mapping (something I also covered in my previous update). The result was great, but the clip was only 8 seconds long, so I slowed it down in Premiere. It’s longer now, but the movement in the image is less noticeable.
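Depth-map motion like Pixverse's can be approximated crudely: shift each pixel horizontally by an amount proportional to its depth, so near pixels slide more than far ones as the virtual camera drifts. A toy numpy sketch (the real tools also inpaint the gaps this shifting opens up):

```python
import numpy as np

def parallax_shift(img, depth, dx):
    """Fake a small camera move on a still image. `depth` is an HxW
    map in [0, 1] where 1 = near; near pixels shift by up to `dx`
    pixels, far pixels barely move."""
    h, w = img.shape[:2]
    out = np.empty_like(img)
    xs = np.arange(w)
    for y in range(h):
        # read each output pixel from its depth-weighted source column
        src = np.clip(np.round(xs - dx * depth[y]).astype(int), 0, w - 1)
        out[y] = img[y, src]
    return out
```

Rendering this for a ramp of `dx` values over time yields the 2.5D drift effect; the quality of the result lives or dies on the quality of the depth map.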
Shooting the model practically ended up being a bit rough. Firstly, propping the camera above the model (whether actually placing it overhead, or turning the model on its side) proved to be a hassle. C-stands would have fixed this, but the ones I ordered before the new year still never shipped.
I used to own a slider, but the person I lent it to never returned it; a lesson for the future about loaning out equipment. To work around this, I jerry-rigged a dolly to a tray so I could pull the model and the green screen.
The results came out clunky, and rotoscoping in After Effects got rough with the finer details, so I ended up freeze-framing the shot in Premiere and adding the motion manually through keyframes.
Colormatch.ai is software I invested in a few years back. It works by importing a still image with a color grade you’d like to try out, and exporting a similar result for use in your own projects. Since one of the inspirations for my project is Akira Kurosawa, I wanted to replicate his black-and-white look.
But why use this instead of just pulling the saturation down? One thing I’ve noticed about Kurosawa’s films is their halation and sharp contrast, and I wanted to replicate that as best I could. The end results weren’t perfect, but they’re a good starting point.
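Roughly, the look I was chasing has three ingredients: a luma conversion, a contrast push, and halation (blurred highlights added back over the image). Here's a numpy sketch with made-up parameter values; this is my simplification, not Colormatch.ai's actual method:

```python
import numpy as np

def kurosawa_look(img, contrast=1.6, halation=0.4, radius=5, threshold=200):
    """High-contrast B&W with highlight halation. img: HxWx3 uint8 RGB."""
    # Rec. 601 luma for the black-and-white conversion
    gray = img @ np.array([0.299, 0.587, 0.114])
    # sharp contrast: push values away from middle gray
    gray = np.clip((gray - 128) * contrast + 128, 0, 255)
    # halation: box-blur only the highlights, then add them back
    high = np.where(gray > threshold, gray, 0.0)
    k = 2 * radius + 1
    cs = np.cumsum(np.pad(high, ((radius + 1, radius), (0, 0))), axis=0)
    blur = (cs[k:] - cs[:-k]) / k                    # vertical pass
    cs = np.cumsum(np.pad(blur, ((0, 0), (radius + 1, radius))), axis=1)
    blur = (cs[:, k:] - cs[:, :-k]) / k              # horizontal pass
    return np.clip(gray + halation * blur, 0, 255).round().astype(np.uint8)
```

The key difference from a plain desaturation is that last step: bright areas bleed a soft glow into their neighbors, which is the filmic halation that plain saturation changes can't reproduce.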
Suno.ai is similar to Midjourney, but for generating non-copyrighted music. The results aren’t perfect and there’s a lot of artifacting in the songs, but for something matching my theme, I think it turned out quite alright for now.
To round out the scene, I added some fog effects and speed-ramped the shot as the ship flies by.
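A speed ramp is, under the hood, just a remapping of output frames to source frames by integrating a per-frame playback rate; Premiere's time remapping does this (with interpolation between frames). The bare-bones frame math looks like:

```python
import numpy as np

def speed_ramp(n_src_frames, speeds):
    """Map each output frame to a source frame. `speeds[i]` is the
    playback rate at output frame i (1.0 = realtime, 0.5 = half
    speed, 2.0 = double speed)."""
    # the source read-head advances by `speed` frames per output frame
    pos = np.concatenate([[0.0], np.cumsum(speeds)[:-1]])
    return np.clip(np.round(pos), 0, n_src_frames - 1).astype(int)
```

Feeding it a rising-then-falling speed curve gives the slow-in, whip-past, slow-out feel of the ship flyby; an NLE additionally blends neighboring frames instead of rounding to the nearest one.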
In the end, I came up with this result after about 2-3 hours of messing around to see what works, and about an hour and a half of executing a “final” product. As I get further into my production I hope to flesh it out even more.
I hope by the end of the next update, I will have finished the script, storyboard, and shot list.
After that, I will be looking to translate my script into Japanese for authenticity. Then I’ll be onto looking for talent and possibly crew to help me out.
I know I promised a breakdown on EBSynth, and I promise I’ll get that out soon!
Until then, here is the scene that I have created while testing everything out. It’s short, but hopefully by the end it will all be part of a greater scene.
1/28/2024-2/3/2024 Progress Update
This week was somewhat productive on my capstone pre-production.
One of the things I wanted to do was experiment and play with multiple tools in conjunction with each other, mainly Midjourney and Photoshop AI. I wanted to see if I could create a background using Midjourney based on something I filmed in real life, then stylize both the subject and background with Photoshop AI to help blend the two evenly. I would then use EBsynth to apply the look to the rest of the scene.
For those who don’t know, EBsynth allows users to stylize a few frames of a video the way they like and then apply that style to all frames in the sequence. Its goal is to shorten animation time for individuals who would otherwise painstakingly animate everything frame by frame.
To test everything out, I first filmed a toy spaceship on a green screen, and exported it through Premiere as a PNG sequence for EBsynth. The footage was rough, but enough to work.
Choosing one of the frames from that sequence, I sent it over to Midjourney to create a background. It looked good, but the ship came back warped by Midjourney’s interpretation.
With that, however, I took it over to Photoshop and removed the warped subject using Content-Aware Fill to make a blank background. I then brought in the original source image I used as a reference for Midjourney and removed the green screen so only the ship remained. Using Photoshop’s Neural Filters, I applied a style to both layers individually to create the image I posted below.
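The keying-and-compositing step above can be sketched in code, for anyone who wants the concept without the Photoshop UI. This is a bare-bones chroma key I wrote as an illustration, not Photoshop’s actual algorithm: a pixel counts as background when its green channel clearly dominates red and blue, and the mask then composites the foreground over a new plate. The `dominance` threshold is an assumed placeholder value.

```python
import numpy as np

def chroma_key_mask(rgb, dominance=1.2):
    """Return a boolean foreground mask for a green-screen image.

    A pixel is treated as background when its green channel clearly
    dominates both red and blue. rgb: (H, W, 3) floats in [0, 1].
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    background = (g > dominance * r) & (g > dominance * b)
    return ~background

def composite(fg_rgb, bg_rgb, mask):
    """Place foreground pixels over a background plate using the mask."""
    return np.where(mask[..., None], fg_rgb, bg_rgb)
```

Real keyers also soften the mask edge and suppress green spill, which is exactly the fiddly part the Photoshop and After Effects tools handle for you.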
I then repeated the process on different frames, removing the green screen, adding the background, and stylizing different angles of the ship so EBsynth had more reference points for the final project.
So far, it sounds like everything is working well. However, issues occurred when I sent everything to EBsynth and compiled it in Premiere Pro for a final look. The ship came out wonderfully, but the background kept glitching because I had used the same static background for all the frames. The movement in the foreground confused EBsynth, so it tried to make the still background move as well, causing it to distort and artifact.
To fix this, next time I will need to remove the subject from the green screen in After Effects before exporting as a PNG sequence. That way the green screen will already be keyed out and I’ll be left with a single object. Using a Hugging Face AI tool called ZoeDepth, I’ll be able to take the background I made and turn it into a 3D background using what’s called depth-mapping. Then I can apply the background separately through Premiere or After Effects and add movement to it myself, removing the horrible artifacting that was being caused by EBsynth.
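To make the depth-mapping idea concrete: once a tool like ZoeDepth has estimated a depth map for the still background, that map can drive a fake camera move by shifting near pixels more than far ones. The sketch below is my own toy illustration of that 2.5D parallax principle in NumPy, not ZoeDepth itself (which is the model that produces the depth map), and the function name and shift amounts are placeholders.

```python
import numpy as np

def parallax_shift(image, depth, max_shift=8):
    """Fake a camera move on a still image: shift each pixel
    horizontally in proportion to its depth, so near pixels travel
    farther than far ones.

    image: (H, W, 3) array; depth: (H, W) array, 0 = far, 1 = near.
    """
    h, w = depth.shape
    out = np.zeros_like(image)
    cols = np.arange(w)
    for y in range(h):
        # Per-pixel horizontal offset scaled by depth (nearer = bigger)
        shift = (depth[y] * max_shift).astype(int)
        src = np.clip(cols - shift, 0, w - 1)
        out[y] = image[y, src]
    return out
```

Rendering this for a range of `max_shift` values frame by frame gives the background its own motion, which is the movement I’d otherwise add by hand in After Effects.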
As I keep researching, I hope to continue experimenting and find something that works. I have found other AI toolchains for filmmaking, including a blend of Stable Diffusion, ControlNet, and Blender. The learning curve seems pretty steep, but if it’s something I need to learn, I am open to accepting that challenge.
Until next week!
Progress Update #1
Aloha everyone,
Just as a recap for those of you who stumble across these notes but don’t know the goal of my final deliverable: in a nutshell, I am trying to find a way to use the growing presence of open-source AI production tools to enhance my own independent creations. You can find more info in my introduction video, including my stance on the use of AI in the industry.
For the past two weeks, I have been actively prepping for the pre-production phase of my Capstone Deliverable. First I had a discussion with Sharla about my final presentable. It is something I am still trying to iron out, but I do believe a lot of progress has been made. This also marked the beginning of my workback schedule, which I will continue to flesh out over the next couple of weeks.
To start the project off, I created a few ideas for short films that I’d like to try and listed them out. My job moving forward is to establish which one I would like to use. One thing I am trying to keep in mind with these ideas is how I will use the AI tools to enhance them, and whether they would actually benefit from the assistance of those tools.
One of the things I am aiming to do in these early stages is to continue experimenting with some of the tools at my disposal right now. It’s good practice to keep myself up to date with new and upcoming tools, and to better understand the process behind the tools I will be using during the production phase of the project. This mindset has already paid off, as I came across a new tool I had never heard of called ZoeDepth. This AI tool lets me take 2D images and turn them into 3D backgrounds with depth. For example, I could shoot a scene that requires a set extension and use ZoeDepth so I wouldn’t have to rotoscope movement into the background. By utilizing this tool, and hopefully providing examples in the future, I will be able to implement it much more easily into my final deliverable.
Another thing I did during this last week was sign up for Corridor Crew’s Crewtorials. This is a masterclass held by the production crew over at Corridor Digital, and it goes over how they specifically used AI tools to create their “Rock Paper Scissors Anime,” which is one of the catalysts that made me want to dive deeper into this project. I originally wanted to stray away from the complex nature of that project, but felt I should still learn to understand it in case my final deliverable requires it.
Now, moving into the latter half of the first week, this is where things began to get rocky. One of the hurdles I had prior to starting the semester was a scheduled surgery, which I knew from the start would cause some issues in the beginning, but I planned to trek through regardless. The first few days of recovery were rough, as I focused more on recovery than on pre-production. Hopefully Week 3 can still yield some results as I progress.