majorprojectmwang
Practice 3
42 posts
majorprojectmwang · 1 year ago
Text
Reflection and Evaluation 02
I would like to reflect on how feedback worked this semester. I gave feedback to my teacher most Thursdays, and recorded and acted on each round. For regular self-assessment, whenever I completed a small stage I wrote a brief summary. In the bus simulator project I added too many elements without fully considering the time and experience required, so when making plans I should account for everything that might come up. Each time I met my teacher I also raised questions, both technical and academic. I hope to achieve a relatively stable result in the final stage of the project and to push beyond my current level.
I have also summarized the errors I encounter most often. First, the thing to consider when building a scene is proportion: the ratio of buildings to people needs constant comparison. Second, reference images: I usually collect many references, but some are not entirely suitable and need modification, which adds work time. Third, when scene debugging goes wrong and I cannot see a way out of the deadlock, my solution is to search Google and YouTube for answers step by step. Fourth, backups: a critical part of the project that I did not set up until halfway through. Backups ensure the project can be recovered if it is damaged.
As for learning motivation, my state of mind has been fairly stable during this period. I barely went out in the third semester and devoted all my energy to the project, and I believe I maintained my interest and enthusiasm throughout. Sometimes a very complex problem creates significant pressure, and at other times I feel conflicted; my solution is to go for a walk or eat something so my brain gets a proper rest.
I would also like to review the new skills I have learned, starting with the modelling tools in UE5; in the new project I can use these tools much more proficiently. The second is the OSL shader, which can be written to apply shaders to the scene quickly and achieve the best results. The third is the bus simulator blueprint. Although there was not enough time to explore the bus thoroughly, I used a car in place of the bus to build the blueprint, achieving acceleration, deceleration, turning, parking, and other actions. The fourth is the rainwater decal material. Many thanks to Cem Tezcan, the author of the third-party asset; it is a great tool. I broke down his shader to study it, and although I could not replicate it completely, I learned a basic rainwater decal material method from him.
Conclusion
This project has touched me deeply. The way of thinking it required has become habitual and has taught me a lot. I hope to gain more experience and knowledge in my future job. My next stage will be either finding a job or continuing to learn new skills, and I should make the right choice.
Reference
Cem Tezcan (2018) Unreal Engine - Rain Material Volume on Foliage. Available at: https://cemtezcan.com/projects/b5Wz8d (Accessed 13 August 2024)
Reflection and Evaluation 01
Today I want to summarize the process as I experienced it. Looking back three months, when I started this plan I set my own goals and planned to complete the bus simulator project within three months. I made a detailed Gantt chart and managed my learning time. A clear goal establishes one's direction, and my main goal was to create a map of a small European town. Looking at the map now, it differs only slightly from the town originally planned, and most things turned out as expected. Below I evaluate which goals I achieved and which I did not:
Achieved goals
Main map layout, achieved according to the original plan. Completion time: 1 week
Iconic French small houses, achieved: 3 weeks
Road surface design, achieved: 3 days
Main buildings (shopping mall, hospital, hotel, park, school, train station), completed: 3.5 weeks
Design and production of small houses, completed: 5 days
Ornaments and vegetation added, completed: 1 day
Lighting added (including scene lighting for different weather conditions), completed: 2 days
Weather effects added (including night and rain effects), completed: 1 day
Bus simulator blueprint added (research and learning), completed: 3 days
Video post-production editing and rendering, completed: 1 day
Unfinished goals
Amusement park scene: abandoned because the plan left insufficient time
Exploring the principles behind the bus blueprint: incomplete; the difficulty was higher than expected and needs more research time
Traffic blueprints (automatic car simulation and traffic signal systems): incomplete; too difficult and beyond the scope of environment art, requiring substantial research time
Further refinement of the scene: incomplete; the scene is too large and time control fell short
Secondly, I would like to review the learning methods I used. For learning resources, I mainly relied on online courses, plus some e-book websites and the study of third-party assets. Overall, quite diverse.
I take blog notes and organize them every week. When building shaders, to check whether a shader works and whether the effect meets expectations, I have to test it many times. I have also tried other learning methods: I often discuss problems with people online or with classmates, which helps things stick. My approach starts from a strong interest in the industry, which drives active learning; for my current specialism it is still a combination of study and practice, and I need to keep summarizing my own problems and resolving them through online videos or my teachers.

Reviewing my time management: although the project was basically completed on time, I feel the total amount of work in the chosen project was too high. With so many assets, there is a real risk of not finishing them or of only average quality. For future projects I need to budget time that is clearly sufficient rather than vague, in order to guarantee the quality of the result. Some parts of the learning process did not match the plan: some knowledge could not be mastered within the time I had set, which led to slippage. The main reasons were unfamiliarity with new knowledge, inexperience in running projects, and inadequate day-to-day planning.
Quantum Rendering and Neural Rendering (Forefront)
Quantum rendering is a cutting-edge theoretical technology that would combine quantum computers with computer graphics, and it could become a powerful support for future environment art. It promises to break with traditional rendering methods as an efficient, fast way of handling highly complex rendering tasks, using quantum computers to process ray tracing and global illumination. In principle it could outpace even today's real-time rendering. For example, traditional pipelines must wait for the processor to compute each image, and complex layers increase both the computational load and the time; quantum computing could greatly reduce that waiting time.
Quantum computers are still at an early stage of development and the technology is not yet mature, so developers face many challenges that remain to be solved. But quantum rendering is certainly cutting-edge, with continuity and diversity, and I believe it will gradually reach fields such as games, film, and more.
Reference
A23d (2023) Quantum Computing and the Future of 3D Rendering. Available at: https://www.a23d.co/blog/quantum-computing-and-future-of-3d-rendering (Accessed 12 August 2024)
Cross-media Environmental Art
Cross-media environmental art is an innovative art form that integrates different media and technologies. It combines digital art, interactive design, virtual reality (VR), augmented reality (AR), sound design, film, installation art, and other media to create a new experience. This art form breaks through traditional media boundaries and provides a more immersive and interactive art experience.
Characteristics of Cross-Media Environmental Art
A multi-sensory experience: cross-media environmental art typically engages sight, hearing, touch, and even smell. For example, in a 4D cinema the audience sits in motion seats with 3D glasses, a large dome screen, rain generators, and so on, for an immersive experience.
Interactivity and participation: engaging the audience is a cutting-edge design concept for environmental designers. The environment changes continuously with the audience's participation and perception, and each audience member's participation produces a different experience. For example, the plot of a film can change with the audience's choices, branching in different directions.
Combining virtual and real (XR, MR): this is currently the most anticipated and relatively challenging design concept. Combining virtual and real elements opens up a new kind of experience that breaks through the boundary between them. For example, with the development of MR glasses, people can now interact through sight and sound while wearing them, for tasks such as translation and navigation. I believe that in the future glasses could merge with mobile phones and replace their functions.
Reference
Rodolfo Silva (2023) Cinematic storytelling: Creating VFX for Game Environments using Unreal Engine. Discover | The Rookies. Available at: https://discover.therookies.co/2023/07/05/cinematic-storytelling-creating-vfx-for-game-environments (Accessed 12 August 2024)
VR & AR Technology (Forefront)
In VR and AR, the goal of environmental art is to create a fully immersive virtual world. This requires highly intricate details, realistic lighting, and natural interactions. The use of real-time rendering technology and high-resolution textures enables these environments to be presented with higher realism.
For environmental art, creating VR environments hinges on several key points. Details and realism: giving players a convincing experience and an environment worth exploring, combining real-time rendering and terrain generation technology to reach the best possible image fidelity. Technology and tools: achieving very realistic effects puts heavy demands on software and hardware; with the UE5 engine and an RTX graphics card, results can be displayed much faster. Player comfort also matters: the environment design should avoid causing discomfort or eye strain.
I envision that in the future, environmental art design will expand in VR, including real-time environments that respond to the player's mood. AI will mature and provide more convenient and efficient services to users in AR and MR. I believe real and virtual environments will eventually merge, giving users and developers a broader, more imaginative creative space.
TeamLab is a Japanese art collective known for its cross-media environmental art. They have created many interactive digital art exhibitions where audiences interact with the works by walking, touching, and other means; these exhibitions typically combine light, sound, and motion capture to create immersive experiences. The Icelandic musician Björk has created multiple cross-media art projects in collaboration with digital artists and programmers, including VR music videos and interactive installations, blurring the boundaries between music, visual art, and technology.
Reference
Jamal_Aladdin (2023) Environmental Art in VR Games: Crafting Immersive Worlds Beyond Imagination. Available at: https://medium.com/@Jamal_Aladdin/environmental-art-in-vr-games-crafting-immersive-worlds-beyond-imagination-b2b0b74d3bb3 (Accessed 12 August 2024)
Ray tracing (Forefront)
Ray tracing technology can provide highly realistic light and shadow effects. In recent years, NVIDIA's RTX series graphics cards have brought real-time ray tracing into the consumer market. Real-time ray tracing can simulate the path of light through the scene, including reflection, refraction, and shadows, significantly improving visual realism.
Real-time ray tracing is a cutting-edge technology that lets developers adjust lighting while the scene is running. With traditional lighting workflows, you must wait for the graphics pipeline and the engine to process the scene before the effect of moving a light gradually becomes visible. Real-time ray tracing is continuous and versatile. Imagine a car driving on a highway when suddenly the whole body splits apart, its parts scattered in the air. In an offline-rendered video, adjusting the lighting for such a shot is very complex and could take a great deal of time. With real-time ray tracing, adjusting the lights can be a moment's work: tweak them, resume playback, and the car reassembles and drives out of view. This technology is genuinely avant-garde and efficient.
As AI develops, adjusting direct and indirect lighting will become increasingly intelligent. Film and television post-production software will also gain ray tracing features, so post-production will be able to achieve engine-like realism.
Reference
Jon Martindale (2023) What is ray tracing, and how will it change games? Available at: https://www.digitaltrends.com/computing/what-is-ray-tracing/ (Accessed 12 August 2024)
Real-time is the future (Forefront)
Visual effects art changes with technological progress. It is easy to imagine that future visual technology will be more forward-looking and efficient, and that technology is real-time rendering. If you could travel to the future, everything you see might not be real, and it might be constantly changing: science-fiction worlds interacting with humans, giving audiences an unprecedented experience; playing games in a completely virtual world, constantly breaking through and reinventing yourself.
“A recent demonstration held by Epic Games showcased how real-time technology can be combined to achieve in camera visual effects in live action shooting. In the demonstration, an actor riding a motorcycle filmed live scenes on a series of LED wall panels. The images generated on the walls seamlessly integrated with the live scenery and could be changed in real-time” (IAN FAILES, 2020)
It is clear that our generation is gradually breaking through into a new era. Films can be rendered in real time, with CG lighting blending seamlessly with the lighting on the actor's body. The scene can be modified live: differently coloured sets produce different GI and different colour bleed, and actors can change position at any time, making the illusion feel genuinely real. Most of these setups are also used on stage. At present, though, in live broadcast the difference between CG sets and real sets is still very obvious, and this cutting-edge technology will need further breakthroughs from the next generation.
Reference
Ian Failes (2020) For Companies at the Forefront, the Future is Real-Time. Available at: https://www.vfxvoice.com/for-companies-at-the-forefront-the-future-is-real-time/ (Accessed 12 August 2024)
Forefront of Lighting
The forefront of CG lighting shows in several aspects, with global illumination first among them. Global illumination simulates how light bounces around a scene, and finer subdivision of light increases the realism of the image. The traditional approach calculates all the scene data at render time, and computing every pixel this way is very time-consuming and complex. As technology keeps innovating, real-time rendering is at the forefront and lighting technology is updating with it. Lumen, the representative lighting system of UE5, calculates object reflections and illumination in real time, improving lighting faster and more efficiently.
Real-time Lumen lighting approaches real lighting, with further improvements in processing speed, frame rate, and lighting quality. But this many Lumen-lit sources consume a lot of memory, and I think the lighting system needs further optimization, which would improve the computer's multitasking efficiency. For simple small projects, baked light maps may be the better approach.
Reference
Axr (2022) Lumen in Unreal Engine 5. Why is it such a big deal? Available at: https://medium.com/@axr230102/lumen-in-unreal-engine-5-why-is-it-such-a-big-deal-843ffeccee8c (Accessed 12 August 2024)
Unreal Engine (2021) Lumen in UE5: Let there be light! Youtube. Available at: https://www.youtube.com/watch?v=Dc1PPYl2uxA&t=664s (Accessed 12 August 2024)
Forefront of Technical Art 02
PCG Process and Technology
Here is some information I found about the PCG framework:
“A PCG Volume is placed into a level containing a simple landscape mesh. In the PCG Graph, a Surface Sampler node is added. It uses the PCG volume to sample the landscape geometry and provide spatial information, which generates point data. Next, additional nodes are added to further filter and modify the points. For this example, the Transform Points node is used to add rotation and scaling to the points. Finally, the final result is output. In this case, the data is used to spawn static meshes in the scene — several trees.” (Jack DiLaura, 2023)
I started trying to build scenes with PCG nodes. I must admit I am not a professional technical artist, but I believe mastering this relatively cutting-edge technology is very helpful for environment artists. The Surface Sampler node is very powerful: it controls the density and spacing of the generated object meshes, greatly increasing the efficiency of game creation. For an environment artist, technical art is a catalyst for environment art. Plants grow randomly across the terrain, and the NormalToDensity node controls how many of the generated plants appear: X, Y, and Z give different behaviours, with the Z axis being the most significant, defaulting to 1. After debugging the node, white means fully displayed, black means not displayed, and grey means partially displayed. With the Z value at 0.5, the area displays about 50% of the trees.
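To make the NormalToDensity behaviour concrete, here is a small illustrative Python sketch. It is my own stand-in, not engine code, and the function names are invented: the idea is simply that a surface normal's Z component drives the probability that a tree spawns at that point.

```python
import random

def normal_to_density(normal_z: float, strength: float = 1.0) -> float:
    """Map the Z component of a surface normal to a density in [0, 1].

    Hypothetical stand-in for the node described above: flat ground
    (normal_z near 1) yields full density, steep slopes yield less.
    """
    return min(1.0, max(0.0, normal_z * strength))

def cull_by_density(points, seed=0):
    """Keep each point with probability equal to its density, so a
    density of 0.5 spawns roughly half of the candidate trees."""
    rng = random.Random(seed)
    return [pos for pos, nz in points if rng.random() < normal_to_density(nz)]
```

With a density of 0.5 over a large sample, roughly half the candidate points survive, matching the 50% tree coverage observed above.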
The Transform Points node is even more intuitive. In its parameters I can set a maximum and minimum for each value, and within that limited range each plant is randomly scaled, moved, and rotated. The Density Filter node adjusts the Upper Bound and Lower Bound across the whole set, allowing large-scale tuning of the final result.
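The two nodes above can be sketched the same illustrative way. The function names, signatures, and default ranges below are my own assumptions for the sketch, not the PCG API:

```python
import random

def transform_points(points, scale_range=(0.8, 1.2), yaw_range=(0.0, 360.0), seed=0):
    """Sketch of the Transform Points behaviour: each point receives a
    random uniform scale and Z rotation inside the min/max bounds."""
    rng = random.Random(seed)
    return [
        {"pos": p,
         "scale": rng.uniform(*scale_range),
         "yaw": rng.uniform(*yaw_range)}
        for p in points
    ]

def density_filter(points, lower=0.0, upper=1.0):
    """Sketch of the Density Filter: keep only points whose density
    lies between the Lower Bound and Upper Bound."""
    return [p for p in points if lower <= p["density"] <= upper]
```

Tightening the two bounds of the filter is what lets you thin out or concentrate vegetation across the whole map in one move.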
Reference
Jack DiLaura (2023) Unreal Engine Procedural Content Generation for Immersive Installations. Available at: https://interactiveimmersive.io/blog/unreal-engine/unreal-engine-procedural-content-generation-for-immersive-installations/ (Accessed 12 August 2024)
Adrien Logut (2022) Unreal Engine 5.2 PCG. Youtube. Available at: https://www.youtube.com/watch?v=hjk9308SCeE&t=157s (Accessed 12 August 2024)
Forefront of Technical Art 02
Procedural Content Generation
Procedural terrain generation is becoming increasingly common in the games and film industries. Terrain generation plugins can create and edit terrain quickly, and with each Unreal Engine update this procedural capability, and the results it produces, keep getting stronger.
Adrien Logut is a tools programmer at Epic Games who works mainly on Unreal Engine development tools. I discovered that he often posts his work on GitHub, including PCG plugins and tutorials that let designers and game developers learn and discuss.
Logut's contributions have had a significant impact on the VFX community, particularly in real-time rendering and procedural generation, making him a noteworthy figure in the industry. The PCG framework in UE5 is relatively continuous and directional, and PCG works especially well there. It is a collection of tools that lets designers and environment artists iterate quickly by generating their own content. In my own experience I used it to make forests, grass, various plants, mountains, and rocks: instead of placing them repeatedly by hand, they are generated with nodes. The environment artist first enables the plugin to launch the framework, similar to setting up a blueprint.
Reference
Jack DiLaura (2023) Unreal Engine Procedural Content Generation for Immersive Installations. Available at: https://interactiveimmersive.io/blog/unreal-engine/unreal-engine-procedural-content-generation-for-immersive-installations/ (Accessed 12 August 2024)
Adrien Logut (2024) Unreal Engine 5.4 PCG. Youtube. Available at: https://www.youtube.com/watch?v=SxIXnEik2Xk (Accessed 12 August 2024)
Frontiers of Technical Art 01: Real-time Rendering 02
Realtime Fractals
Real-time fractal technology was an unfamiliar term to me; perhaps it sits in a corner of the games and film industries. Producing such visually stunning effects, which often appear in film and game VFX, is difficult and arduous. Here I learned about Jesper Nybroe and Matteo Scappin; many of these great effects come from their techniques.
Jesper released this Unreal Engine-based technology in 2022, and as Unreal Engine continues to upgrade, the fractal tool upgrades with it. The terrain it generates produces stunning visuals, with continuity, complexity, and effectiveness; these strange yet familiar landscapes are also used frequently in games.
I have to say this technology is closely related to the environment artist's field. For non-realistic sci-fi scenes, the fractal tool exposes over 30 adjustable parameters, including transformation modes, colour control, proportion, emission, and more. Even more impressive, its real-time rendering has very low latency: in the demonstration the image changes as the parameters are adjusted, helped by UE5's global illumination. The technique makes full use of destruction and irregularity. These novel designs are amazing, and they can even be animated. It is hard to imagine static objects moving, deforming, and splitting; it is wonderful, even unsettling.
These rich, realistic effects demand powerful graphics hardware such as an RTX 2080 GPU, and growing GPU compute power is driving the prospects of the games industry.
Reference
Machina Infinitum - Fractal Experiences (2022) Unreal Engine 5 - Realtime Fractal Plugin. Youtube. Available at: https://www.youtube.com/watch?v=x3HNHkJlBqk (Accessed 12 August 2024)
FXguide25 (2022) Realtime Fractals in Unreal. Available at: https://www.fxguide.com/quicktakes/realtime-fractals-in-unreal/ (Accessed 12 August 2024)
Frontiers of Technical Art 01: Real-time Rendering
The cutting-edge field of Technical Art is constantly evolving with the rapid development of gaming, film and television, virtual reality (VR), augmented reality (AR), and other digital content creation industries.
Real-time ray tracing: with improved hardware performance, real-time ray tracing has become possible. It provides more realistic light and shadow in games and virtual environments, greatly improving visual quality. Path tracing: more complex path-tracing algorithms are gradually entering real-time rendering; although still experimental, they are expected to deliver even more refined light and shadow in the future.
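The cost behind path tracing is easy to feel with a toy example. Path tracing estimates light transport by averaging many random samples, so noise falls off only as 1/sqrt(N). This illustrative Python sketch (my own, not engine code) Monte Carlo-estimates a simple stand-in integral the same way:

```python
import random

def estimate_radiance(samples: int, seed: int = 0) -> float:
    """Toy Monte Carlo estimator: average f(x) = x^2 over [0, 1].

    The exact value is 1/3. As in path tracing, the estimate's noise
    shrinks only as 1 / sqrt(samples), which is why real-time path
    tracing needs heavy denoising or lots of hardware.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        x = rng.random()   # one random "light path"
        total += x * x     # its contribution
    return total / samples

coarse = estimate_radiance(16)      # noisy, like a 1-sample-per-pixel frame
fine = estimate_radiance(65536)     # converges toward the true 1/3
```

The fine estimate lands very close to 1/3 while the coarse one can be visibly off, which mirrors why per-frame path tracing remained out of reach until denoisers and faster GPUs arrived.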
Real-time rendering greatly increases the efficiency of technical artists and environment artists. Its forefront is characterized by continuity, diversity, and sociality, and it serves developers and consumers alike. With hardware support, real-time rendering has opened markets in new fields such as games, film and television, architecture, and automotive, and toolmakers keep developing ever more efficient tools for developers.
Reference
Real-time computer graphics. Wikipedia. Available at: https://en.wikipedia.org/wiki/Real-time_computer_graphics (Accessed 12 August 2024)
Rain Decal Research 06
Based on the earlier dynamic image, this should be a blend of two textures, made similarly to the earlier Rain Base Material. My hypothesis is to take the two rain textures I used before, animate them with Panner nodes, and blend them with a Multiply, which produces the effect I was aiming for.
The first step is to add a Panner animation to the falling-streak texture so that it loops downward continuously. The setup here is the same as in my earlier study of the rain decal.
Next, convert the result into values and feed them into a second Panner, which drives the animation of the second texture. The second Panner's UV coordinates should match the first so the two effects stay consistent.
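The Panner setup described above boils down to simple UV arithmetic: offset the coordinates by speed times time and wrap them so the texture tiles. A hypothetical Python sketch of that idea (UE materials evaluate this on the GPU, of course):

```python
def panner(uv, speed, time):
    """Sketch of a Panner node: offset UVs by speed * time and wrap
    into [0, 1) so the texture tiles seamlessly as it scrolls."""
    (u, v), (su, sv) = uv, speed
    return ((u + su * time) % 1.0, (v + sv * time) % 1.0)

# Both layers take the same UV input and speed, so the second texture
# stays locked to the first while the rain scrolls downward.
layer_a_uv = panner((0.25, 0.75), (0.0, -0.5), time=1.0)
layer_b_uv = panner((0.25, 0.75), (0.0, -0.5), time=1.0)
```

Because the two calls share the same inputs, the layers move in lockstep, which is exactly the consistency the second Panner needs.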
This gives the rain a partial falling effect. I am gradually inferring the developer's intent: I think the Noise added to the later nodes is there to break up the look of falling too quickly. Adjusting the parameters produces the display effect I wanted.
To make the raindrops more realistic, a blended-node approach can be used. Duplicate the original material nodes, then multiply the second copy with the data from the earlier drip map to form a doubled, more complex blend, increasing the detail of the result. Finally, a Max node selects the larger of the two texture values at each point to produce the final result.
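The Multiply and Max steps can be sketched numerically. This toy Python version is my own illustration, with lists of greyscale values standing in for textures; it shows why Max keeps the brightest droplet from either layer instead of washing the result out the way addition would:

```python
def multiply(layer_a, layer_b):
    """Per-pixel multiply of two greyscale layers (values in 0..1):
    darkens overall, useful for masking one pattern by another."""
    return [a * b for a, b in zip(layer_a, layer_b)]

def max_blend(layer_a, layer_b):
    """Per-pixel max, as a Max node does: wherever either layer has a
    bright droplet, that droplet survives in the combined result."""
    return [max(a, b) for a, b in zip(layer_a, layer_b)]
```

Since both inputs stay in the 0..1 range, Max never overshoots, so the combined droplet pattern keeps valid texture values.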
Reference
Cem Tezcan (2018) Vehicle Animated Rain - Waterdrop Material & FX. Unreal Engine Marketplace. Available at: https://www.unrealengine.com/marketplace/en-US/product/animated-rain-waterdrop-material (Accessed 10 August 2024)
Rain Decal Research 05
I think this is a very good shader, and the developers have considered it comprehensively. They exposed well-designed adjustable parameter switches on the material instance, which lets environment artists control it freely.
Reasoning backwards from the material function, it is clearly a dynamic map. Although the developers built many controllers, the dynamic map generated by this material function actually achieves most of the effect. Based on my earlier study, it appears to be a composited image blended with the earlier rain decal. This is the research challenge for me.
Reference
Cem Tezcan (2018) Vehicle Animated Rain - Waterdrop Material & FX. Unreal Engine Marketplace. Available at: https://www.unrealengine.com/marketplace/en-US/product/animated-rain-waterdrop-material (Accessed 10 August 2024)
Creating a Rainy Effect at Night
My plan is to add a rain effect to the final promotional video and present it in the edit. There are many ways to achieve rain; I will use a blueprint approach, adding falling rain plus decals on wet surfaces to achieve the effect.
I previously studied the rain decal, and after testing, it can still achieve the expected result. The author, however, has taken the decal a level deeper, achieving different effects in the X, Y, and Z directions.
On the three faces of a test cube, different effects can be seen, because the result is assembled from three different decals.
The display is very realistic. In the X and Y directions, similar effects can be achieved by rotating the decal, but the Z axis behaves differently: water droplets linger on the surface and tend to bounce upward. This can be compared against reference footage of real rain hitting the ground.
On rainy roads at night, the lights generate reflections, but the road surface is not perfectly smooth, so the reflections are relatively blurry; reflections will be stronger on smooth roads and weaker on rough ones. One of the difficulties, I think, is how the bubbles and splashes that rain forms on upward-facing (Z-axis) surfaces should appear in the decal. Due to time constraints, I referenced this Dynamic Rain blueprint, which can create a rain effect quickly.
Reference
Cem Tezcan (2018) Vehicle Animated Rain - Waterdrop Material & FX. Unreal Engine Marketplace. Available at: https://www.unrealengine.com/marketplace/en-US/product/animated-rain-waterdrop-material (Accessed 10 August 2024)
HEAVY RAIN on Road at Night to Sleep FAST, Rain no Thunder to Relax, Study (2023). Youtube. Available at: https://www.youtube.com/watch?v=y5ewEezvkKw (Accessed 10 August 2024)
The lighting effect of the cinema
According to my design plan, I turned the building in the lower right corner into a cinema, part of the entertainment district. My original plan was a large hotel, but hotels are not that innovative and European hotel architecture does not vary much, whereas a cinema design is more modern. I added some light strips to decorate the cinema and give it a mysterious atmosphere at night.
I blurred the posters in the cinema, since I was concerned about infringing film copyrights, and gave them a self-illuminated (emissive) effect. I also applied some dim lighting inside the cinema.
Reference
Beautiful Glowing Line Motion Graphics in After Effects (2023). Youtube. Available at: https://www.youtube.com/watch?app=desktop&v=hgFxfeX5m_M (Accessed 2 August 2024)
Lighting Night
The basic construction of the scene is complete, and I will add two lighting setups next: a night lighting effect and a rain lighting effect. Every time I make a game I really want to experience it as a driver, driving in different weather and at different times of day; I hope a driving simulator can give players the most realistic experience possible.
I imported a bus simulator blueprint for testing and began imagining myself as the driver, controlling my speed as I drove. My biggest concerns are the scenery outside the window and whether the light intensity affects the driver's field of vision. I run real tests on every drive and compare what I observe with real-world lighting references so I can improve.
I do not know much about lighting, so it is something worth exploring, starting with the street lamps. There are several light types: Spot Light, Point Light, and Rect Light. A Spot Light works as a flashlight, streetlight, or searchlight, emitting light in a diverging cone; I mainly place these under the cinema's searchlights, where different colours give the cinema different looks. The Point Light is the most common source, radiating in all directions, suitable for light bulbs, streetlights, and self-illuminated road signs; these look best at night. The Rect Light emits in a single direction from a flat rectangular area, mainly used for light panels of various kinds; most of the light in these scenes comes from indoors, and this source also suits outdoor street-side pavilions.
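As a sanity check on how these light types behave, here is a small Python sketch of the underlying physics. It is an idealised illustration of my own, not UE5's actual attenuation model, which adds a finite attenuation radius and other controls:

```python
def point_light_intensity(source_intensity, distance):
    """Idealised inverse-square falloff for a point light: doubling the
    distance quarters the light received at a surface."""
    return source_intensity / (distance ** 2)

def in_spot_cone(cos_angle_to_axis, outer_cone_cos):
    """A spotlight only lights points whose direction from the light
    falls inside its cone; comparing cosines against the outer-angle
    cosine is the usual cheap test."""
    return cos_angle_to_axis >= outer_cone_cos
```

This is why a streetlight (point-like) pools brightly underneath itself and fades quickly, while a searchlight (spot) throws a hard-edged cone: the falloff governs the first, the cone test the second.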