#Forward kinematics
simplyanjuta · 1 year
Text
Making dynamic poses/animations that adjust to a sims body with Animation Tools by thepancake1
Tumblr media
Made this short guide after talking to thepancake1. I haven't seen many people use this feature yet and felt like it might be worth sharing? There are some limitations to be aware of, but I think it's a useful option (for poses as well as animations) 😊
Many thanks to thepancake1 for the tools and for the helpful explanations he provided for this guide.
1. Background and in-game mechanics
The way TS4 handles different body shapes and clothes in animations (for example, to avoid clipping) is basically by placing markers (“slots”) on the surface of a sim's body that can then be used as targets by the in-game IK system.
As you probably know, IK (Inverse Kinematics) – as opposed to the default FK (Forward Kinematics) – is a setup where a chain of bones is solved backwards from its end. So, for example, when you move a hand, the arm will follow.
In a similar way, what in-game IK does is assign a bone or slot to animate relative to. For example, if your sim is posed with hands on the hips, you can assign the hands to hip slots, and the game will then process the pose/animation and perform IK in real time to adjust the position of the arms and hands relative to the hips.
Note that there are limitations to this system, though: only the hands, feet and the root bind can procedurally target other bones/slots.
Foot targets and the root target are mainly used in interactions with objects, in particular in sitting animations (where, for example, the root targets a chair slot).
Hand targets are mainly used for adjusting a pose/animation to a sim's body shape and clothes.
The in-game IK always influences the complete arm/leg (chain: foot-calf-thigh/hand-forearm-upperarm).
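If you want to picture what an IK solve actually computes, here's a minimal Python sketch of analytic IK for a planar two-bone chain (think upper arm + forearm). This is just the textbook math to illustrate the idea – it has nothing to do with the game's actual solver:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic IK for a planar two-bone chain (bone lengths l1, l2).
    Returns (shoulder_angle, elbow_angle) placing the 'hand' at (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow bend needed to reach the target distance
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # clamp if target is unreachable
    elbow = math.acos(cos_elbow)
    # Shoulder angle = direction to target minus the offset caused by the bent elbow
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1, l2):
    """FK: chain the two rotations to find where the 'hand' ends up."""
    hx = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    hy = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return hx, hy
```

Solving for a reachable target and then running FK on the result lands the hand exactly back on the target – that round trip is the whole point of IK.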
You can find an article that provides some background information here: https://simsvip.com/2014/08/20/community-blog-the-sims-4-animations/
2. Using the feature in custom poses/animations
By default, when you make custom poses/animations with Blender and S4S, the information the game needs to apply in-game IK is not included. So you will notice that while a pose might look good on the base rig in Blender, it might not fit sims with a different shape in game, in particular causing clipping.
You can use TS4 SimRipper to fine-tune a pose for a certain sim, but this might not be the solution you’re looking for, if you share your poses and/or want them to be compatible with different kinds of sims.
The animation tools now include a feature that makes it possible to use in-game IK with custom poses/animations as described in part 1.
Disclaimer:
The animation tools are in development and subject to change.
Currently TS4 SimRipper sims are not properly supported. (Although, imo for the purpose of the task, it doesn't make much sense to use them as models.)
As you can see in the comparison below, there might be some accuracy loss for the flexibility gained. (The position of the arms in the version without IK targets matches exactly the pose I made in Blender but doesn't work at all for the heavier sim causing extreme clipping. At the same time, the version with IK targets deviates a bit from the pose I made, but works for any sim.)
Tumblr media Tumblr media
Below is a short guide on how to set it up, using the example pose I made.
1) You can download the tools HERE. Make sure to check the installation guide and tutorials in the wiki tab for the basics. (Note: The tools were originally made for Blender 3.0 but also support newer versions, in particular Blender 3.3.)
Some additional tips for poses in another post of mine HERE.
After you set up the tools in Blender and have made your pose:
2) Go to the S4AnimTools panel. Fill out rig name, clip splits and clip name as described in the tutorials linked above (also make sure to select “Allow Jaw Animation”).
3) Find & click “Create World IK channels”.
Tumblr media
This will create 5 IK channels for the aforementioned hands, feet and the b_ROOT_bind bone. You can add further channels either by cloning the existing ones or by clicking “Create World IK channels” again. To get rid of unwanted channels, click “Delete”.
4) Set up the targeted bones/slots based on the type of pose you made.
In my example, I created a pose where a sim has the left hand on the hip, and the right hand close to the thigh. Therefore, I added IK channels targeting the “b__L_frontBellyTarget_slot” and “b__R_ThighFrontTarget_slot”.
The slots are marked blue in the picture below. Some notes:
The slots are hidden by default; I made them visible for the picture. You can unhide all available bones/slots by pressing Alt+H, but I recommend doing this on a separate rig or in another blend file, or undoing it directly afterwards, if you don't want all the (unnecessary) bones/slots blocking your view.
The selected slots worked well enough for my example, but you should figure out what is suitable for you. (For example, the HandDangle slots seem to be commonly used when the arms are hanging near the body.)
For orientation, you can also look up clip files of EA poses/animations via the S4S Game File Cruiser and see which bones/slots are used as targets ("Warehouse" tab -> "SlotAssignments"; IKChainIndex: 0 = left hand, 1 = right hand, 2 = left foot, 3 = right foot, 4 = root). On that note: the Clip Pack export loses/resets the slot data, but you can use it to find an animation and note its Instance ID, then search for that in the Game File Cruiser. (If you know the name of an animation, you can also determine its Instance ID by converting the name with the S4S Hash Generator.)
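On the hashing side: as far as I know (treat this as an assumption and double-check any result against the S4S Hash Generator itself), TS4 instance IDs are FNV-1 64-bit hashes of the lowercased name. A tiny Python sketch of that algorithm:

```python
def fnv1_64(name: str) -> int:
    """FNV-1 (not FNV-1a) 64-bit hash of the lowercased name.
    Assumption: this matches what the S4S Hash Generator computes for
    64-bit instance IDs - verify against the tool before relying on it."""
    h = 0xCBF29CE484222325  # FNV 64-bit offset basis
    for byte in name.lower().encode("utf-8"):
        h = (h * 0x00000100000001B3) & 0xFFFFFFFFFFFFFFFF  # multiply first...
        h ^= byte                                           # ...then XOR (FNV-1 order)
    return h
```

(Some resource types reportedly also set the high bit of the result; again, check against S4S for the exact convention.)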
5)      To ensure an animation works properly and, in particular, blends with other animations in game, each IK channel should (also) target b_ROOT of the rig.
The bones are marked green in the picture below. This setup was recommended to me by pancake. Another experienced creator mentioned, though, that in his experience it's only necessary to target b_ROOT at the start and end of an animation.
Also note that this seems to be a requirement for animations made as in-game interactions and might not be necessary for poses or animations used with Andrew's Pose Player.
6) The start and end frames are set according to the length of the pose/animation (I want to use the pose as a CAS trait pose, so I set the duration to 150 frames = 5 seconds). The exception is the b_ROOT target for the hands, where the end frame is set to 0 in my example, since the hands are targeting the slots “b__L_frontBellyTarget_slot” and “b__R_ThighFrontTarget_slot” during the animation.
Note: My example is a static pose. In animations, however, you can also target different slots at different times by setting up multiple IK channels and specifying the start frame and end frame respectively.
Tumblr media
7) Bake the animation by pressing “Bake IK”.
8) Export the clip and create a package with your pose/animation as described in the tutorials linked above.
@ts4-poses​ @thefoxburyinstitute
693 notes
booleanean · 1 year
Text
Day 9 - Aphrodisiacs - Robots with Dicks
"Oh fuck, yes, yes," Carla panted.
She heard Felix scribble something on his clipboard. For an AI postdoc, he was oddly attached to pen and paper.
The Mk1's chassis had been completed weeks ago. Carla had stared at it where it stood in the corner of their basement workshop, waiting for Felix to finish training the AI, until finally she couldn't take it anymore.
"Yes, harder, faster." Carla moaned as the robot's control loop interpreted the commands and thrust into her with greater intensity.
Every inch of it was as good as she could make it, artificial filament muscles covered in variably translucent silicone to visually measure performance, hydraulic actuators in the torso visible behind 3D printed transparent aluminium. Strength was about twice what the strongest human could achieve without modification or drugs, dexterity on par with the best industrial robots from ten years ago, but on a fully mobile base. It was the peak of humanoid robotics, right at the very bleeding edge of technology. Their research paper was going to omit certain additions she'd made to it, though she had been half tempted to see if they could win a Nobel and Ig Nobel for the same project.
Felix looked up long enough from his clipboard to stroke her hair.
"Feel good?"
"Oh fuck yes, I'm close."
Even the penis attachments were works of art. Integrated in a special modular pelvis, she'd created two prototypes. The one that was rocking her world right now was a basic steel shaft with an internal set of ducting keeping it at body temperature, and a separate network of microtubules dispensing lubrication along the entire length. Pressure transducers and temperature sensors fed back into the control loop, letting the robot respond to her physiological responses as well as her voice commands.
She was saving the other prototype for the full AI integration test. The basic functionality worked the same as a mammalian penis. Silicone stood in for flesh, with a body safe hydraulic fluid for blood, filling corpora cavernosa made of custom designed aerogel. It even had realistic skin that slid along the basic structure. The sensors were also inspired by biological systems, with increased density in the tip. The pièce de résistance was a realistic set of testicles, weighted properly, that contained most of the operating mechanics and a fully functional ejaculation system tuned to mimic anything from a pathetic little dribble to a pressure and volume any porn star would sign away their immortal soul for.
She already had plans for another, more futuristic attachment with a direct magnetic nerve stimulator for the clit and g-spot.
"Fuck YES!" Carla screamed as she came.
The sensors in the robot's dick tripped the control loop into a new regime, keeping the same pace perfectly, matching her thrashing movements, letting her focus on nothing but her own pleasure. The impassive face, silicone lips pressed tightly together, eyes scanning her face mechanically, pulled her out of the moment a little, but the perfect fucking it was delivering got her close to the edge again right on the heels of her first orgasm. Just before her pleasure peaked, the robot pulled back out of her completely and sat back on its heels between her legs.
"CONTROL LOOP FAILURE, SAFETY MODE ENGAGED."
She screamed her frustration at the abrupt feeling of emptiness and ruined orgasm so tight on the heels of such a good one.
"Fuck, that sucked." Carla tried to catch her breath. "Mk1, go stand in the corner."
"COMMAND UNCLEAR, PLEASE RESTATE."
She pushed it off the bed with her foot, the basic inverse kinematics keeping it stable as it shifted to the floor. At least that was still working.
"Walk forward four paces, turn forty-five degrees clockwise, walk two more paces, then go into standby mode."
Carla pulled at Felix's shirt, trying to get it off over his head while he tried to hold on to his clipboard.
"Are you going to take notes, or are you going to fuck me? The Mk1 clearly isn't up to the task yet."
He froze, then tossed his clipboard aside. They kissed as he fumbled his pants off. He was inside of her seconds later, rock hard. They'd fantasized together about being with other people, but never wanted to make it a reality. The fantasy was hot; real people were too far for both of them. When they'd been working on the Mk1 together, Carla had suggested a little side project. Felix clearly got off on seeing her with it.
"Yes, fuck me, fuck me." Carla rocked her hips against him, meeting him thrust for thrust. She held his gaze, urging him on. Within minutes, he had her back at the edge, years of being together had taught him exactly what she liked. She held herself there, holding back, waiting for him.
"Cum in me, fill me, yes, YES!" Carla felt Felix stiffen inside of her and then warmth flooded her. She let go and screamed his name as she came, "Felix, Felix, Felix!"
When he collapsed on top of her, she stroked his back. He was still inside of her, and she could still feel the occasional twitch of his cock.
"Of course! There was no path from the 'partner orgasm occurring' state into the 'partner orgasm starting' state! God damnit, I forgot to account for multiple orgasms in quick succession. Fuck."
Felix kissed her, muffling her last word. He pushed up on his hands, hovering over her, still inside. "The tensor farm should be done testing the latest model by 8pm. If this one is all green, we can probably have it installed by 10 and give it another shot with the AI this time."
"I'll rewire the state machine for the control loop in case there's any red tests still."
This had been their sex life for the last couple of months, since they started the project to build the ultimate sex bot. After, often with Felix still inside of her, they'd discuss ideas about what they could change, or features they had to have. This was the first time after a field test though.
"Did you like watching me with it?"
"Oh yes. God damn, that was hot."
"Would you ever want to try it? Both cocks are self lubricating, you know."
"Mmm, maybe. I want to see if we can get a threesome mode working first though."
After dinner they guided the bot back onto its stand in the basement workshop with a dozen cables leading to various parts to extract telemetry, recharge, and provide data connections for reprogramming. Carla was getting distracted trying to rewire the state machine, each possible transition suddenly causing both real and imagined sense memories. Felix looked tastier and tastier as she worked. He was futzing with parameters, rerunning partial tests on subsystems. The tests had all been green, but he'd had ideas to get everything optimized before their first live test.
By 2am, they had the first version of Felix's AI uploaded to the Mk1. She and Felix had curated a lot of videos from Pornhub over the last couple of months, finding performances they liked. Lots of hotwife scenes and threesomes, some bisexual stuff, but mostly relatively vanilla scenes. Carla had added some scenes where the male performer was a bit more rough than Felix was comfortable with doing himself, spanking and pinning wrists above heads. For vocal interactions, they'd retrained a large language model on erotica and textual descriptions of the scenes in the porn videos, generated by an off the shelf accessibility AI.
There wasn't any actual universal intelligence in the robot of course. This was a sexy version of an AI chat bot that most phones had built in now, combined with a convoluted control loop for its physical interactions. Simply a very clever way of giving the impression that something was smart, when really all it was doing was basic pattern recognition based on a predefined dataset.
"Want to give it a shot?" Felix asked, but Carla shook her head.
"I want you, not the bot."
Shutting the bot down for the night, Carla drew Felix upstairs back to their bedroom. As they made love, they teased each other with all the amazing things they'd do with the robot tomorrow and in the weeks to come.
The next weekend, Carla really had to admit Felix had been right. Her control loop version of the robot's software was good. It got her off just fine, but it was impersonal. As its designer, she had a hard time focusing on herself as she felt it roll into new control regimes. The AI felt much more human. He looked at you, used his hands for more than balance, and even showed some imperfections in his motions. He got (artificially) winded, slipped out, fell over, all the things a real human partner would do. The experience was so much more realistic, she sometimes forgot it was a robot fucking her if she couldn't see him.
She sat at her desk in their upstairs office now, working on the more serious portion of her research. They had run a series of strength and dexterity tests that afternoon, characterizing the robot's ability to maintain precision while exerting force at different levels, and she was processing the data. Felix was downstairs in the lab, tinkering with parameters and adjusting the training data for the next version of the AI.
Carla heard the neighbor plug in his bass guitar, the amp turned way up. She muttered under her breath about people not respecting their tools. Didn't he know he could damage the speaker like that? The noise wasn't too bad, but listening to Seven Nation Army played by a spirited amateur over and over again didn't really appeal either. Her noise cancelling headphones were in the basement with Felix though, so for now she'd just suffer through.
Her phone beeped halfway through the neighbor's warmup.
Felix: Robot reacting to bass music.
Carla: "music"
Felix: He's getting better. Anyway:
The next message was a picture of the robot's dick, the biomorphic one, clearly at half mast.
Carla: Is he on?
Felix: in standby
Carla: Odd. Sensors recording?
Felix: Yup, caught it before the buffer flushed. AI parameter log too.
Carla: nice
Before she could really get back into her work, the neighbor finished Seven Nation Army. The next tune he played was the Pornhub sting. She almost spat out her drink. He did a pretty good version, though the lack of drums made it not quite perfect.
Before she could, Felix yelled from down in the basement.
"Carla, come take a look at this!"
The Mk1 was standing in its alcove, still docked to all the various wires and cables. Felix was standing in front of it, studying the biomorphic cock. It was throbbing like a real one would.
"Remember how it was at half mast during the first song the neighbor was playing? Despite it being in standby? I think I figured out the reason."
The neighbor, who had just finished House of the Rising Sun, chose that moment to play another couple of Pornhub opening stingers. The Mk1 responded, humping the air slightly, his cock throbbing.
"You didn't cut out the intros on the training data so—"
"— now every time it hears bass music, and the Pornhub riff in particular, it gets aroused. It's still in standby, it's barely drawing current, but there's enough residual charge in the artificial muscle fibers for, well, this." He gestured at the robot's midsection, still rocking back and forth.
"Aren't the tensor cores supposed to be off?" Carla watched a slow drop of lubricant fall from the tip of the twitching robot cock.
Felix shook his head, "Some stay on to parse voice commands."
She reached out, touching the silicone cock. It was slick, the lubricant dispensers clearly activated. It was interesting to see that it was apparently simulating precum as well, despite that not necessarily being the focus of their training data. The artificial dick twitched at her touch, and she grasped it firmly, stroking up and down. A slow trickle of fake cum was leaking out the tip now, covering her hand.
"So we're thinking bass guitar is a robo-aphrodisiac then? Because you trained it on videos with Pornhub intros?"
"Mm-hmm."
"That's hilarious."
"And means I have to remove the intro from over fifteen hundred videos, and then retrain and retest the entire model." Felix sighed heavily. "Again."
"There's an ffmpeg command for that, surely."
"The trimming, sure."
Carla kept stroking the robot's cock, watching the artificial foreskin slide back and forth over the head.
"Seems like a shame to waste this though. It really shouldn't be erect out in the open air for too long, it's designed with the idea of at least some counter pressure. Also, it would be a shame to not gather some extra data. It might be interesting to have a robo-aphrodisiac function, though maybe something more specific that won't just trigger if someone forgets to unplug their Bluetooth speaker when they're going to rub one out."
Felix grinned at her, then nodded.
Carla pulled her sweat pants and top off, standing naked in front of the mechanical man. "Mk1, wake up."
The Mk1 went through his wakeup sequence, part mandated by technology, part for show because they were both massive nerds. The cables, mostly plugged in along his arms and back, ejected and retracted into the alcove like Neo waking up in the real world for the first time. The sound effect of Seven of Nine's alcove powering down at the end of her regeneration cycle played, and Mk1 took a single step forward.
"Hello Carla, nice to see you again. What would you like to do today?"
Felix had campaigned long and hard for the robot to say "Please state the nature of the sexual emergency" but eventually she had put her foot down. The chances of that ending up in a version they showed off at their defense were too high, and while Robert Picardo could get it, the Doctor was a bit too acerbic for her tastes.
She walked over to the mattress they kept in the basement for quick tests, standing at the foot. She was in the mood for something a bit more rough than just the vanilla stuff they'd tried with the bot so far, and this heightened state it was operating in seemed to be a perfect opportunity to try that out.
"Take me. Be a little rough."
Before, he'd always asked for confirmation before initiating anything sexual. It hadn't been hardwired, but the AI training data was heavily incentivized towards asking consent first. This time though, with three long, powerful strides he was inches away from her. The intensity of his movements was a little scary, but she had the utmost faith in her and Felix's work. Still, she took a half step back reflexively.
"Are you sure this is a good idea?" Felix asked.
Carla stood staring at the Mk1, transfixed by his gaze. She knew it was just servos and cameras and tensor cores running a neural network, there was nothing there, but she still couldn't look away.
"I need this," Carla whispered.
With that, Mk1 took one more step, pushing her over and onto the mattress. He guided her down as they tumbled, cushioning her fall a little and making sure her head didn't hit the ground, but it was still an intense experience.
Deciding to fight a little, she tried to push him off. He gathered her wrists in one hand and effortlessly pinned her arms above her head.
"Pause," Carla said.
Immediately, the Mk1 froze. He still held her, but the pressure on her wrists was lower, and he held all his weight off of her.
"Good, that still works just fine. Resume."
The intensity the Mk1 showed was unreal. She'd enjoyed him before, but with this level of robotic arousal added on top, she could finally completely lose herself in the act. There was no room for thinking about kinematics and control loops, muscle fiber force limits, defects, or additions to the training data. There was no worry about her partner's pleasure, no anxiety for her own performance. All that was there was her own pleasure, pure and uncomplicated.
She fantasized about a future where a Mk2 and Mk3 could join in with the Mk1, taking turns getting her off, letting them recharge and refuel in shifts as they spent an entire day teasing her from orgasm to orgasm.
Mk1's synthesized voice, indistinguishable from human despite being produced by a speaker rather than a voice box, let her know how good this felt for him. All artificial of course, but so necessary for a realistic experience. Soft moans, grunts, little gasps. Even simulated breathing growing shorter as he exerted himself. It had still sounded artificial to her previously, but now it just went straight to the pleasure center of her brain, letting her enjoy the moment even more. She came, crying out as he whispered her name in her ear.
Just as her wrists were beginning to hurt, he shifted, pulling her legs up against his chest. The new position let him reach new and interesting places inside of her, the intentional curve she'd put on his cock letting him hit her g-spot. As she approached her second orgasm of the afternoon, he started moaning louder, grunting. When she came, so did he. The twitching of his cock was entirely lifelike, his orgasm forceful enough she could feel it deep inside of her.
She lay there panting, and he emulated her, letting her bask in the moment. Felix had sat next to her on the mattress, watching her closely. She could see his erection clearly in his sweats.
"That looked intense," he said when she looked over to him.
"Oh yes. We definitely need this feature."
"Would you like to continue?" the Mk1 asked.
Carla flicked her gaze down to Felix's sweats then looked him in the eyes. "Join us?"
Felix grinned and started pulling his shirt off.
42 notes
rainacademia · 2 years
Text
Tumblr media Tumblr media
october 11th, tuesday
attended classes from 10 to 5 today. it was hectic but we had a week off for pujas so the teachers rushed to complete the left bits.
wrapped up kinematics + thermodynamics + colligative properties.
currently solving the pyqs from CP.
dreading tomorrow's lecture on anatomy of cockroach. this is the only part in zoology that i do not look forward to studying.
🎧 : illwalkyouhome by woven in hiatus
311 notes
usafphantom2 · 4 months
Text
Tumblr media
U.S. Army Officer Confirms Russian A-50 Radar Jet Was Shot Down With Patriot Missile
The U.S. Army colonel described how Ukrainian Patriot operators staged a “SAMbush” to bring down the A-50 in January of this year.
Thomas Newdick · Published Jun 10, 2024 6:55 PM EDT
The Beriev A-50U ‘Mainstay’ airborne warning and control system (AWACS) aircraft based on the Ilyushin Il-76 transport aircraft belonging to Russian Air Force in the air. ‘U’ designation stands for extended range and advanced digital radio systems. This aircraft was named after Sergey Atayants – Beriev’s chief designer. (Photo by: aviation-images.com/Universal Images Group via Getty Images).
Tumblr media
A U.S.-made Patriot air defense system was responsible for shooting down a Russian A-50 Mainstay airborne early warning and control (AEW&C) aircraft over the Sea of Azov on January 14, according to a U.S. Army officer. The high-value aircraft, one of only a handful immediately available to Russia, was the first of two brought down in the space of five weeks. Previously, a Ukrainian official confirmed to TWZ that the second A-50 was brought down by a Soviet-era S-200 (SA-5 Gammon) long-range surface-to-air missile.
Speaking on a panel at the United States Field Artillery Association’s Fires Symposium 2024 last week, Col. Rosanna Clemente, Assistant Chief of Staff at the 10th Army Air and Missile Defense Command, confirmed that the first A-50 fell to a German-provided Patriot system, in what she described as a “SAMbush,” or surface-to-air missile ambush.
Tumblr media
“They have probably about a battalion of Patriots operating in Ukraine right now,” Col. Clemente explained. “Some of it’s being used to protect static sites and critical national infrastructure. Others are being moved around and doing some really, really historic things that I haven’t seen in 22 years of being an air defender, and one of them is a SAMbush … they’re doing that with extremely mobile Patriot systems that were donated by the Germans, because the systems are all mounted on the trucks. So they’re moving around and they’re using these types of systems, bringing them close to the plot … and stretching the very, very edges of the kinematic capabilities of that system to engage the first A-50 C2 [command and control] system back in January.”
Fifteen crew members were reportedly killed aboard the A-50.
Col. Clemente also provided some other interesting details of how the Ukrainians worked up their capabilities with these particular systems, which included a period of validation training involving the U.S. Army in Poland in April 2023.
Tumblr media
Elements of a German Patriot air defense system stand on a snow-covered field in Miaczyn, southeastern Poland, in April 2023. Photo by Sebastian Kahnert/picture alliance via Getty Images
According to Col. Clemente, the German soldiers tasked with training the Ukrainians on the mobile Patriot systems woke up the Ukrainian battery in the middle of the night, marched them to a location where they fought a simulated air battle, and then made them march again. “I was like, ‘Huh, wonder why they did that?’ And it was a month later, they conducted some of their first ambushes where they’re shooting down Russian Su-27s along the Russian border.”
As we reported at the time, the use of Patriot to engage the radar plane over the Sea of Azov seemed likely, especially as it followed the pattern of an anti-access counter-air campaign that Ukraine was already waging against Russian military aircraft using the same air defense system.
Accordingly, in May 2023, Ukraine began pushing forward Patriot batteries to reach deep into Russian-controlled airspace. Most dramatically, a string of Russian aircraft was downed over Russian territory that borders northeastern Ukraine. Among them may have been the Su-27s (or perhaps another Flanker-variant aircraft) that Col. Clemente mentioned.
Tumblr media
A screen capture from a Ukrainian Air Force video shows images of three Russian helicopters and two Russian fighters painted on the side of a Patriot air defense system. The three helicopter and two jet images bear the date May 13, 2023. Defense Industry of Ukraine
While the use of German-supplied weapons within Russian territory previously led to friction between Berlin and Kyiv, German officials more recently approved the use of Patriot to target aircraft in Russian airspace.
In December 2023, similar tactics were used against tactical jets flying over the northwestern Black Sea.
Tumblr media
These kinds of highly mobile operations were then further proven with the destruction of the first A-50, on the night of January 14.
Tumblr media
A Russian Il-22M radio-relay aircraft was also apparently engaged by Ukrainian air defenses the same night, confirmed by photo evidence of the aircraft after it had made it back to a Russian air base. It’s not clear whether the Patriot system was also responsible for inflicting damage on this aircraft, but it’s certainly a probable explanation.
Tumblr media
A photo of the Il-22M which purportedly made an emergency landing in Anapa, in the Krasnodar region of western Russia. via X
Both incidents appear to have taken place in the western part of the Azov Sea and, as we discussed at the time, the distances involved suggested that, if Patriot was used, it was likely at the very limits of its engagement envelope.
Based on Col. Clemente’s account, it seems likely that the Patriot system in question was not only being pushed to the limits of its capabilities but was likely being deployed very far forward in an especially bold tactical move.
As we wrote at the time: “Considering risking a Patriot system or even a remote launcher right at the front is unlikely, and these airborne assets were likely orbiting at least some ways out over the water, this shot was more likely to have been around 100 miles, give or take a couple dozen miles.”
Of course, all this also depends on exactly where the targeted aircraft were at the time of the engagement.
Tumblr media
A map showing the Sea of Azov as well as Robotyne, which is really the closest Ukraine regularly operates to that body of water, a distance of roughly 55 miles. Google Earth
Once again, the A-50 shootdown may be the most important single victory achieved so far by Ukrainian-operated Patriot systems, but it was part of a highly targeted campaign waged against the Russian Aerospace Forces which has seemingly included multiple long-range downings of tactical aircraft.
The Ukrainian tactics first found success in pushing back Russian airpower and degrading its ability to launch direct attacks and even those using standoff glide bombs, which have wreaked havoc on Ukrainian towns.
The same anti-access tactics extended to Russia’s small yet vital AEW&C fleet have arguably had an even greater effect. After all, these aircraft offer a unique look-down air picture that extends deep into Ukrainian-controlled territory. As well as spotting incoming cruise missile and drone attacks, and low-flying fighter sorties, they provide command and control and situational awareness for Russian fighters and air defense batteries. According to Ukrainian officials, the radar planes are also used to direct Russian cruise missile and drone strikes.
Tumblr media
Dmitry Terekhov/Wikimedia Commons
The importance of these force-multipliers has prompted earlier efforts to disable them, with A-50s in Belarus having been targeted by forces allied with Ukraine.
The recent appearance of a photo showing a Ukrainian S-300PS (SA-10 Grumble) air defense system marked with an A-50 symbol also indicates that previous attempts were made to bring these aircraft down using this Soviet-era surface-to-air missile, too.
With all this in mind, it’s not surprising that Ukraine’s highly valued, long-reaching Patriot air defense system was tasked against the A-50.
In demonstrating the vulnerability of Russian aircraft patrolling over the Sea of Azov, the January 15 shootdown might have been expected to push these assets back. That may have happened, but another example was then shot down at an even greater distance from the front line, on February 23. The fact that the second A-50 came down over the Krasnodar region fueled speculation that it may have been a ‘friendly fire’ incident.
However, Lt. Gen. Kyrylo Budanov, the head of the Ukrainian Ministry of Defense’s Main Directorate of Intelligence (GUR), subsequently confirmed to TWZ that the second A-50 — as well as a Tu-22M3 Backfire bomber, in a separate incident — were brought down by the Soviet-era S-200 long-range surface-to-air missile system.
Undoubtedly, there are more details still to emerge about the shootdowns of the two A-50s, not to mention other engagements that the Ukrainian Patriot has been involved in.
However, Col. Rosanna Clemente’s comments confirm that the Ukrainian Air Force is using these critical systems in a sometimes-daring manner, using limited numbers of assets not only to protect key static infrastructure but also to maraud closer to the front lines and bring down high-profile Russian aerial targets. Not only does this force Russia to adapt its airpower tactics for its own survival, reducing its effectiveness, but it also provides another means for Ukraine to fight back against numerical odds that are stacked against it.
Contact the author: [email protected]
18 notes · View notes
academiawho · 5 months
Text
130 Day Productivity Challenge!
30 April, 2024 - Day 130
Tumblr media Tumblr media Tumblr media
And with this post, I have achieved 130 days of productive work and continuous study.
I wrote a mock test in the morning to check my spirits. It went fine, I noted down my doubts in it and surely aim to resolve them in time.
I revised some basic maths, Units and Dimensions, Kinematics, Morphology in flowering plants, Ecology, Plant Growth, Evolution and Plant Families before my test today.
I'm getting 677/720 in today's test, a good improvement from yesterday's score. And today's paper was of medium difficulty.
I updated my mistake book and went through the topics in NCERT.
I revised the NCERT Physics Book 1 through summaries and points to ponder.
Hope you had a good day💛
⭐✨⭐✨⭐
This productivity streak has been successfully completed, and I'm excited to move forward and keep working. NEET 2024 is 4 days away and, for the first time, I'm excited.
My ask box and messages are always open to students and friends interested in mutual topics and easy conversation. All the very best to you all! It was such a treat being on this journey 🌈💛
11 notes · View notes
admitonegame · 22 days
Text
Admit One Dev Blog: Update 46 - Rigging (Part 2)
This week we have another quick rigging update:
Tumblr media
The basic skeleton is now complete! It may need a little polish here and there, but it's in a place where I'll be comfortable making controls for it as well as setting up the Inverse/Forward Kinematics. While the mesh isn't final at the moment, I'd like to get to a place where I can start making custom animations for both the Player and human-like enemies. New animations will allow me to make a lot more progress when it comes to improving the moment-to-moment gameplay, and hopefully I'll be able to retain these animations when I update both the Player and enemy models later on. Thanks for reading, I'll catch you next week!
4 notes · View notes
gender-trash · 10 months
Note
I had to go to a department seminar today for a requirement and the talk was 'Modeling, Estimation, and Control of Quadrupedal and Humanoid Robot Locomotion in Non-Inertial Environments' and it was interesting but I kept thinking 'darn, I bet gender-trash would love this' It was essentially, hey can we get a robot to walk in a straight line while on a rocking ship?' and the answer was 'lol. sorta kinda' anyhow I hope you have a nice evening :)
:0 this is really cool!! i looked up the person giving the seminar (yan gu) and found this video linked from one of the papers she coauthored:
youtube
(n.b. that i just skimmed this paper and am definitely not bothering to work through all the math, but) the key assumptions this makes are that 1) the robot can perceive or otherwise know the movement of the surface relative to itself (in the lab they use fiducials stuck to the treadmill, and hint vaguely in the paper that integrating sensor data from the ship or whatever would be plausible in a real-world deployment, which -- PERSONALLY i am much less confident about, as someone who has been socially adjacent to industry work on robot integration with elevator controllers, but whatever, it's research, i'm willing to cut them slack on that) and 2) the walking surface is planar (for the biped the surface position is determined from forward kinematics assuming the robot's feet always make full contact with the surface when it does a steppy; i read the quadruped paper much less thoroughly but given how careful they were to select a gait that always has three feet in contact with the ground i'm assuming the same condition holds).
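a quick illustration of why the planar assumption is so convenient: three forward-kinematics foot positions pin down the surface orientation with a single cross product (toy sketch -- setup and numbers invented, not from the papers):

```python
import math

def surface_normal(p1, p2, p3):
    """Unit (upward) normal of the plane through three 3D contact points."""
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p1, p3)]
    n = [u[1]*v[2] - u[2]*v[1],      # cross product u x v
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    if n[2] < 0:                     # flip so the normal points up
        n = [-c for c in n]
    mag = math.sqrt(sum(c*c for c in n))
    return [c / mag for c in n]

# three stance feet on a deck pitched 10 degrees about the y-axis; in a
# real system these points would come from the robot's forward kinematics
th = math.radians(10.0)
feet = [(x*math.cos(th), y, -x*math.sin(th))
        for x, y in [(0.3, 0.2), (0.3, -0.2), (-0.3, 0.0)]]
n = surface_normal(*feet)
pitch_deg = math.degrees(math.asin(n[0]))  # recovers the 10-degree pitch
```

(the real estimators also track how the surface moves over time, of course -- this only gets you a snapshot.)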
DEFINITELY a big improvement over "let's assume the floor does not move :)" control, and i don't mean to be critical here, i think it's perfectly fair for a controls paper to leave the software integration/perception challenges up to Future Work to figure out! this is just how i personally analyze robotics research -- there's always a bunch of assumptions involved to make a robot problem tractable, some of them more realistic than others, so the most important thing i want to understand about a new research thing is what assumptions they're making. (for example, a lot of navigation research assumes no sane person would design or construct a building like the stata center...)
(also, really makes you appreciate how average humans can walk on wildly pitching ships with zero perception, just pure IMU, surface contact sensing, and proprioception/kinematics. take a moment to be grateful for your cerebellum :p)
anyway -- thanks for the pointer, i always love seeing awesome new controls-y stuff! anon, i assume you know most of this stuff already, but for followers interested in learning more about controls for leggy bois, i always recommend russ tedrake's underactuated robotics class -- he has his very well-written lecture notes and several years worth of lecture videos all available online. thanks prof tedrake i love you <3
9 notes · View notes
Note
What are some of your current blender projects? Or just things you like abt the program and hobby?
Hi sorry I meant to answer this several days ago but I kept forgetting until right before bedtime and I knew this would not be short... Thank you for the ask!
Ok so I don't know what your definition of "current projects" is but there's nothing I'm actively working on right now, I'm just playing Minecraft all day, but I have many wips in various degrees of being abandoned, most recent being an alternative to https://minetrim.com/ that would run in Blender and be controlled by a geometry nodes modifier. I do hope I get around to finishing it, since the online tool as it currently exists is lacking in many aspects and can be a little buggy. It probably wouldn't be all that useful for most players since you'd need to install Blender and understand a few basics of how to use it, but it would at least help me plan my armor trims (if you don't know, Minecraft recently added a system for cosmetic customization of the player's armor combining colors and patterns, and the website and my tool are meant to simulate user-selected combinations to see what it'd look like).
And that is one of my favorite things to do in Blender, create little tools with geometry nodes, which is basically a visual simplistic programming language interacting with many of the things Blender does. I can create a customizable banana with randomized shapes and spot locations with a slider for age, length, curviness, thickness, you name it, and produce a photorealistic banana (this is one abandoned project), or I could make a regular polyhedron generator taking only a Schläfli symbol as input (another abandoned project, here's a great video that inspired me to try, I highly recommend it if you have no idea what I just said), or, as mentioned, an armor trim simulator.
But Blender can do so much and I can't talk about it without mentioning that it is Completely Free And Open Source and it's good for so much more than just 3D modeling. You can of course add materials to the model, defining exactly how the surface interacts with light, defaulting to what is physically possible but not limiting to this, allowing you to create every possible and a wide range of impossible materials. And then of course you can render that, with several rendering engines to choose from depending on the look you want and the computing power and time you're willing to invest, but I usually use the ray tracing engine (simulating rays of light for most realistic result, which you shouldn't do in video games much because games have to render in real time please don't conflate my use with the crimes of AAA games). But why stop at a still image? You can animate the model, and the material if you want, and you can animate basically any property, and of course to make animation more interesting you can rig it to a skeleton (usually how characters are animated) (includes inverse and forward kinematics of course) or do physics simulation including fluid/smoke/fire simulation, soft and rigid body simulation, and cloth simulation. They've made some changes to simulation and hair since I last looked at those aspects so I'm not totally in the loop on the details but it's good and only getting better. Ok cool you've got your little animation and you can render it to a little video, neat, but it's just in a void? Do you have to model the whole background too? Well, you can, or (I'm gonna oversimplify and gloss over a lot of differences and unique challenges of each method here) you could use an HDRI (high dynamic range image, meaning it contains A Lot more information about lighting from a much wider range of values than a normal image... you know how you point your phone camera at a light source and the image goes dark? 
That's because your phone camera has a lower dynamic range than human eyes, it can only see a small range of light or dark values at a time), or, hey, if you do have a video camera, you can try something really fun... camera tracking. If you film a video of real life, with the camera moving around, you can plug that video into Blender and with some help it can figure out exactly how the camera is moving through space! This means, you can make the virtual camera inside Blender move the exact same way around the animation you made, and you can then render that video and lay it over the footage you took and BAM your 3D object is now in that scene! It might still look off, of course, if it doesn't cast a shadow on the ground or if it's reflective and clearly not reflecting its surroundings, but there are solutions to all these issues and if you're like me it'll turn into a fun little puzzle.
And that's just. Just scratching the surface ok? I just. Love Blender. I love that it's free and open source. If you have a computer and time you can just. Make a movie with amazing special effects and yes yes we love practical effects but trust me digital effects are not evil they're just overused because it's not unionized so it's cheaper labor but you can't tell me it isn't cool that you could make a photorealistic video of a dragon landing on your own rooftop without paying for software or putting up with ads or risking malware with piracy it's legal it's free it's fun. I will not tell you it's not time consuming but I will tell you it's cool and free*
*I recognize not everyone has free access to a video camera, internet, a computer, and enough electricity to run it
Anyway that's my summary of why I love Blender. "That's not a summary, Maws." Trust me. This is the short version.
2 notes · View notes
gt-brainrot · 2 years
Text
The Librarian's Guest
Art Thatcher was quite looking forward to their retirement. They used the word lightly, of course, there wasn't really any retiring from borrowing, but she thought she had gotten about as close as a person could get. She had eventually taken the plunge, leaving their previous dwelling and setting course for that palace of knowledge, the local library. With a community garden situated right nearby, and a treasure trove of barely perused books tucked away at the back of the building, by the corner of speculative fiction and biographies, she had everything she needed to find a quiet life in the pages.
Art had first noticed Tildy because she sang. Not because she was particularly loud, and certainly not because she was any good at it, but because she was singing in a library, and because if there was one thing Art had learned about humans, it was that they liked quiet in their libraries. Granted, most of this singing happened after the library was closed, when the guests had left, and the human assumed she was alone with her books. She really couldn't have known that Art had just moved in, camping out in her dusty little corner, nestled into the rather obscenely large translation of A True Story that had become her first reading project. Besides, it wasn't like her singing was bad, or anything. Art was more than capable of putting up with it.
At least, that's what she thought when she first moved into the library.
Art had barely been there a week before she was looking forward to when Tildy came by to re-shelf the biographies, singing her tune. A month later, and she had heard enough to start half-mindedly humming along. It may have interrupted her reading, but Tildy was still diurnal, and Art had all night to catch up on their stories. If anything, Art saw her daily song as a wake up call, an alert that business was done for the day, and that the library would be free to peruse until dawn.
And so, every evening, when Tildy sang and restocked the shelves, Art packed up her lunch, made a nest between the pages, and sang along.
~~~
Now Mathilda Cobbler had first met a borrower some forty years prior. He was quite nice, a self-professed scholar who had come to the library in search of knowledge on astronomy. Tildy had only noticed him because of his bungled attempt at reaching said knowledge, which was located on a shelf nearly four feet off the ground. Of course, being a scholar, his climbing skills were approximately on par with those of a rat stuck in a bathtub, leaving him to be found by Tildy, hanging between two shelves by some twine caught around his sickly twisted knee.
It had been slow going at first, as Peter spent his first few nights deathly afraid of her, and Tildy had no doubts that if he had been well enough to walk, she never would have seen him again. Thankfully, by the time he had recovered from his injuries, his fears had diminished, and he ended up staying with Tildy for nearly five years, reading every book on astronomy, physics, kinematics, and the movement of celestial bodies there was to find. When he eventually set off on the long journey home, Tildy was sad to see him go, but she was young, and busy, so she moved on quickly. In the intervening decades, she had nearly forgotten about the whole affair, but she still kept Peter’s room, the place the two of them had made together, in the remnants of an old shoe box, and her odd little roommate never entirely left her mind.
So it was that, when Tildy noticed that her evening karaoke sessions had picked up an accompanying vocalist, she was not overly surprised.
Not that she went rushing over to introduce herself, no, she remembered how terrified Peter had been of her at first, Tildy knew that if she wanted any chance of befriending her new guest, she would have to be subtle. And so she began baking.
17 notes · View notes
neelkamble · 2 months
Text
Recent Advancements in Digital Character Creation: MetaHuman and Marvelous Designer Integration in UEFN
Epic Games has made significant strides in democratizing high-quality character creation and animation tools through the integration of MetaHuman and Marvelous Designer capabilities into Unreal Editor for Fortnite (UEFN). This development marks a pivotal moment in the accessibility of advanced digital human and clothing design technologies for a broader range of creators.
Optimizing MetaHumans for UEFN
A key achievement in this integration was the substantial reduction in the average size of a MetaHuman character from nearly 1GB in Unreal Engine 5 to just 60MB in UEFN[1]. This optimization was achieved through several techniques:
Creating a new LOD (Level of Detail) table derived from original Unreal Engine MetaHuman LODs
Implementing animation scaling options
Reducing texture sizes and baking out Normal Maps
Simplifying materials used for MetaHuman faces
Optimizing grooms and reducing their cook sizes
These optimizations not only make MetaHumans more accessible to UEFN creators but also ensure better performance in diverse project contexts.
Enhanced Animation Workflows
The integration brings the full feature set of the UE MetaHuman Plugin directly into UEFN. This allows creators to:
Generate MetaHumans from existing meshes, footage, or using the MetaHuman Creator web app
Animate MetaHumans based on performances captured on iPhone or professional HMCs
Apply animations to any MetaHuman or compatible Fortnite character
Digital Clothing Pipeline
In collaboration with CLO, Epic Games has introduced a new pipeline for digital clothing creation that bridges Marvelous Designer and Unreal Engine. Key features include:
A new USD export option for garments in Marvelous Designer
Direct translation of clothing panels and parameters into Unreal Engine
A flexible Cloth Asset Editor in Unreal Engine with both automated and manual setup options
Kinematic Collider for more accurate and efficient cloth simulations
This integration streamlines the process of creating and implementing realistic, simulated clothing for digital characters, significantly reducing the barriers to high-quality costume design in game development.
The integration of MetaHuman and Marvelous Designer capabilities into UEFN represents a significant leap forward in accessible, high-quality character creation tools. By reducing file sizes, streamlining workflows, and providing powerful yet user-friendly tools, Epic Games is empowering a wider range of creators to produce professional-grade digital humans and clothing for their projects.
References:
Monsen, J., 2024. Integrating MetaHuman & Marvelous Designer Capabilities into UEFN. [online] 80 Level. Available at: https://80.lv/articles/epic-games-on-integrating-metahuman-marvelous-designer-capabilities-into-uefn/
0 notes
maacsatara · 2 months
Text
Rigging Techniques for Character Animation
Character animation is a vital aspect of creating lifelike movements and expressions in animated films, video games, and other forms of digital media. One of the most critical steps in this process is rigging, which involves creating a skeletal structure that can control the character's movements. Rigging is an art and a science, requiring both technical proficiency and creative problem-solving. At an esteemed animation institute in Pune, students learn the intricacies of rigging, mastering both foundational and advanced techniques to bring characters to life. Here, we'll explore various rigging techniques for character animation, from basic principles to advanced methods, providing a comprehensive guide for animators at all levels.
#### Understanding Rigging
Rigging involves creating a digital skeleton for a character, known as a rig, which includes bones, joints, and control systems. These components work together to enable the animator to manipulate the character in a realistic and fluid manner. A well-rigged character can perform a wide range of movements, from simple actions like walking and running to complex motions such as dancing or fighting.
#### Basic Rigging Techniques
1. **Creating the Skeleton**:
   - The first step in rigging is creating a basic skeleton that aligns with the character's anatomy. This skeleton consists of a series of bones connected by joints. Each bone corresponds to a specific part of the character, such as the arm, leg, or spine. The placement of these bones must be precise to ensure realistic movement.
2. **Joint Placement**:
   - Proper joint placement is crucial for realistic animation. Joints should be placed at natural pivot points, such as the shoulders, elbows, knees, and hips. Misplaced joints can result in unnatural or awkward movements, so it's essential to study the character's anatomy and plan accordingly.
3. **Forward Kinematics (FK)**:
   - FK is a technique where the animator controls each bone individually, starting from the root and moving outward. This method is intuitive and straightforward, making it suitable for animating simple, linear movements. However, FK can be time-consuming for complex animations as it requires adjusting multiple bones to achieve the desired pose.
4. **Inverse Kinematics (IK)**:
   - IK is an advanced technique that allows the animator to control the end effector (e.g., the hand or foot) while the software calculates the positions of the intermediate bones. This method simplifies the animation process, especially for movements where the end effector needs to stay in a fixed position, such as a foot remaining on the ground while the body moves.
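The FK/IK contrast is easy to make concrete with a planar two-bone limb: FK maps the two joint angles to a hand position, while analytic IK inverts that with the law of cosines. A minimal sketch (bone lengths and angles are arbitrary, and real rigs solve this in 3D with pole vector controls):

```python
import math

def fk(l1, l2, a1, a2):
    """Forward kinematics: shoulder angle a1, elbow angle a2 -> hand (x, y)."""
    x = l1*math.cos(a1) + l2*math.cos(a1 + a2)
    y = l1*math.sin(a1) + l2*math.sin(a1 + a2)
    return x, y

def ik(l1, l2, x, y):
    """Analytic two-bone IK: hand target (x, y) -> (a1, a2)."""
    # Law of cosines gives the elbow angle from the target distance.
    cos_a2 = (x*x + y*y - l1*l1 - l2*l2) / (2*l1*l2)
    a2 = math.acos(max(-1.0, min(1.0, cos_a2)))
    a1 = math.atan2(y, x) - math.atan2(l2*math.sin(a2), l1 + l2*math.cos(a2))
    return a1, a2

# Round trip: pose the arm with FK, then recover the same angles with IK.
a1, a2 = ik(1.0, 1.0, *fk(1.0, 1.0, 0.4, 0.9))
```

Clamping `cos_a2` handles targets outside the arm's reach, which is exactly the situation where a rig's IK has to stretch or give up.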
#### Advanced Rigging Techniques
1. **Blend Shapes**:
   - Blend shapes are used to create facial expressions and other deformations that cannot be achieved with bones alone. By morphing between different shapes, animators can produce a wide range of expressions and subtle movements. This technique is particularly useful for lip-syncing and emotional performances.
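Under the hood, a blend shape is just a weighted sum of per-vertex offsets from the base mesh: result = base + Σ wᵢ·(targetᵢ − base). A toy sketch with an invented three-vertex "mesh":

```python
def apply_blendshapes(base, targets, weights):
    """base: list of (x, y, z); targets: {name: vertex list}; weights: {name: float}."""
    result = [list(v) for v in base]
    for name, w in weights.items():
        for i, (bv, tv) in enumerate(zip(base, targets[name])):
            for axis in range(3):
                # Add this shape's offset, scaled by its weight.
                result[i][axis] += w * (tv[axis] - bv[axis])
    return result

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
targets = {"smile": [(0.0, 0.2, 0.0), (1.0, 0.2, 0.0), (0.0, 1.0, 0.0)]}
posed = apply_blendshapes(base, targets, {"smile": 0.5})
# posed[0] moves halfway toward the target: y goes from 0.0 to 0.1
```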
2. **Driven Keys**:
   - Driven keys allow one attribute to control another, enabling complex interactions between different parts of the rig. For example, a driven key can be used to automatically adjust the position of the fingers when the hand is rotated. This technique helps streamline the animation process and ensures consistency.
3. **Set Driven Keys (SDK)**:
   - SDKs are a variation of driven keys that provide even more control. They allow animators to create custom controls that can drive multiple attributes simultaneously. This method is especially useful for creating advanced facial rigs and other intricate animations.
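Conceptually, both driven keys and SDKs boil down to a keyed curve that the rig evaluates: look up the driver attribute on the curve, write the result to the driven attribute. A sketch using plain linear interpolation (Maya actually evaluates animation curves with tangents, and the finger-curl numbers here are invented):

```python
import bisect

def driven_key(keys):
    """keys: sorted (driver_value, driven_value) pairs -> evaluation function."""
    drivers = [k[0] for k in keys]
    def evaluate(driver):
        # Clamp outside the keyed range, like a flat pre/post infinity.
        if driver <= drivers[0]:
            return keys[0][1]
        if driver >= drivers[-1]:
            return keys[-1][1]
        i = bisect.bisect_right(drivers, driver)
        (d0, v0), (d1, v1) = keys[i - 1], keys[i]
        t = (driver - d0) / (d1 - d0)
        return v0 + t * (v1 - v0)
    return evaluate

# Hand rotation (degrees) drives finger curl: fully closed fist at 90.
finger_curl = driven_key([(0.0, 0.0), (90.0, 60.0)])
```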
4. **Constraints**:
   - Constraints are used to link the movements of different objects or bones. For example, a point constraint can be used to attach a prop to a character's hand, ensuring it moves in sync with the hand. Constraints can also be used to create complex mechanical systems, such as gears and pulleys.
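A constraint is evaluated rather than keyed: every frame, the constrained object's transform is recomputed from its target. A minimal 2D parent-constraint sketch (the prop and offset values are invented):

```python
import math

def parent_constrain(parent_pos, parent_angle, offset):
    """World position of a child held at a fixed local offset from its parent."""
    c, s = math.cos(parent_angle), math.sin(parent_angle)
    ox, oy = offset
    # Rotate the local offset by the parent's angle, then translate.
    return (parent_pos[0] + c*ox - s*oy,
            parent_pos[1] + s*ox + c*oy)

# A sword gripped 0.1 units along the hand's local x-axis:
hand_pos, hand_angle = (2.0, 1.0), math.pi / 2
sword = parent_constrain(hand_pos, hand_angle, (0.1, 0.0))
# With the hand rotated 90 degrees, the offset now points along world y.
```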
5. **Spline IK**:
   - Spline IK is a technique used for animating flexible, snake-like movements, such as tails, tentacles, or spines. It involves using a spline curve to control the positions of the joints along the length of the object. This method provides smooth, flowing movements and is ideal for characters with elongated appendages.
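The mechanics are easy to sketch: sample the control curve and place each joint of the chain at a sample point, so moving the curve's CVs drives the whole chain. A minimal version using a quadratic Bézier as the "spline" (CV positions invented; production spline IK also preserves bone lengths and twists joints along the curve):

```python
def bezier_quad(p0, p1, p2, t):
    """Point at parameter t on a quadratic Bezier curve."""
    u = 1.0 - t
    return tuple(u*u*a + 2*u*t*b + t*t*c for a, b, c in zip(p0, p1, p2))

def spline_joints(p0, p1, p2, n_joints):
    """Place n_joints evenly in parameter along the curve (tail/tentacle chain)."""
    return [bezier_quad(p0, p1, p2, i / (n_joints - 1)) for i in range(n_joints)]

# A five-joint tail following an arced control curve:
tail = spline_joints((0.0, 0.0), (2.0, 2.0), (4.0, 0.0), 5)
```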
6. **Muscle Systems**:
   - For highly realistic animations, muscle systems can be used to simulate the underlying muscle structure of the character. These systems deform the character's mesh based on the movements of the bones, creating lifelike bulges and contractions. Muscle systems are particularly useful for animating creatures and human characters with realistic anatomy.
#### Best Practices for Rigging
1. **Plan Ahead**:
   - Before starting the rigging process, it's essential to have a clear plan. Understand the character's anatomy, movement requirements, and any specific needs for the animation. Planning ahead can save time and prevent issues later in the process.
2. **Keep It Simple**:
   - While advanced techniques can enhance your rig, it's essential to keep the rig as simple as possible. Overcomplicating the rig can make it difficult to animate and troubleshoot. Focus on creating a functional, efficient rig that meets the needs of the animation.
3. **Test Thoroughly**:
   - Regularly test the rig throughout the creation process to ensure it performs as expected. Test a range of movements and poses to identify any issues early on. This approach helps catch problems before they become more challenging to fix.
4. **Use Reference**:
   - Reference materials, such as anatomy books, videos, and real-life observations, can be invaluable when rigging a character. Study how real bodies move and deform to create more realistic and believable animations.
5. **Continuous Learning**:
   - Rigging is a complex and ever-evolving field. Stay up-to-date with the latest techniques and tools by participating in online communities, attending workshops, and following industry trends.
#### Conclusion
Rigging is a foundational skill in character animation, enabling animators to bring their characters to life with realistic and expressive movements. By mastering basic and advanced rigging techniques, animators can create versatile rigs that meet the demands of any animation project. Whether you're a beginner or an experienced professional, understanding and implementing these techniques will enhance your ability to produce high-quality character animations.
0 notes
usafphantom2 · 1 year
Text
Tumblr media
Typhoon pilot wearing the ‘Warty Toad Hat’ (WTH). Electric hats have changed the game.
Nope. Not anymore. Not for a long time. Helmet mounted displays changed the game a long time ago. Early versions were fielded by the South African Air Force and then on aircraft such as the MiG-29. We all got incredibly bunched about the threat’s ability to throw an off boresight shot at us, before we remembered that we could throw one a similar angle off boresight (away from straight forward) using the radar. Then we got bunched again because working the HOTAS and watching a screen whilst manoeuvring hard isn’t quite the same ‘User Experience’ as some form of evil eye attached to your bone dome.

The fact is that helmet mounted cueing systems changed the game and in many ways wrote a cheque that High Off Boresight (HOBS) weapons cashed. Why strive to get into the Control Zone (that bit of sky behind the enemy from which he cannot eject you kinematically) when you can simply look at the enemy and unleash an AIM-9X or other similar weapon? These weapons are extraordinary. Some can be launched over 90 degrees off boresight. Just picture what that looks like as compared to the WW1 experience of getting to height, finding the enemy and starting to circle. It looks like this: anything in your bit of airspace can be shot.

We no longer need to stop at HMS either. How about targeting an aircraft that you can’t see other than as a track being passed to you via datalink? Can you imagine how annoying it would be to be in danger of winning a 1 v 1 only to soak up a shot that was cued using a datalink track from a third fighter?
2 notes · View notes
surajheroblog · 3 months
Text
Creating Expressive Characters: Rigging Best Practices
Tumblr media
Rigging is the unsung hero of 3D animation. It’s the invisible art that gives life to characters, allowing them to move, emote, and interact with their virtual world. In this blog post, we’ll delve into rigging best practices that elevate your characters from stiff puppets to expressive beings. Whether you’re a beginner or a seasoned animator, these tips will enhance your rigging skills and bring your characters to life.
1. Understanding Character Anatomy
Before diving into rigging, understand the anatomy of your character. Study how joints connect, where muscles flex, and how bones move. This knowledge informs your rigging decisions and ensures natural movement.
Rigging the Spine
Use a hierarchical spine rig with controls for each vertebra.
Add squash-and-stretch attributes for flexibility during animations.
Facial Rigging
Rig facial muscles (brows, lips, cheeks) individually.
Blend shapes (morph targets) allow precise facial expressions.
2. Creating Custom Control Rigs
Control rigs are your animator’s toolbox. Let’s explore advanced techniques:
NURBS Curves and Custom Shapes
Create custom NURBS curves as control handles.
Attach them to joints for intuitive manipulation.
IK/FK Switching
Implement IK/FK (inverse kinematics/forward kinematics) switching.
Seamlessly switch between posing (FK) and natural movement (IK).
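One common way to implement the switch is to evaluate both solutions every frame and blend the joint angles with a 0-1 switch attribute. A planar two-bone sketch (the solver is the standard law-of-cosines one; the poses and lengths are invented):

```python
import math

def two_bone_ik(l1, l2, x, y):
    """Law-of-cosines IK for a planar two-bone limb -> (root, mid) joint angles."""
    cos_mid = (x*x + y*y - l1*l1 - l2*l2) / (2*l1*l2)
    mid = math.acos(max(-1.0, min(1.0, cos_mid)))
    root = math.atan2(y, x) - math.atan2(l2*math.sin(mid), l1 + l2*math.cos(mid))
    return root, mid

def ikfk_blend(fk_angles, ik_angles, switch):
    """switch = 0.0 -> pure FK pose, 1.0 -> pure IK pose."""
    return [(1.0 - switch)*f + switch*i for f, i in zip(fk_angles, ik_angles)]

fk_pose = [0.3, 1.2]                              # animator-keyed FK angles
ik_pose = list(two_bone_ik(1.0, 1.0, 1.2, 0.8))   # foot planted at a target
blended = ikfk_blend(fk_pose, ik_pose, 1.0)       # fully on the IK solution
```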
3. Enhancing Facial Expressions
The face is the canvas of emotions. Rigging expressions adds depth to characters:
Joint-Based Facial Rigging
Use joints for basic facial movements (smiles, blinks).
Combine with blend shapes for nuanced expressions.
Rigging Eyes and Eyelids
Rigging eyes involves aim controllers and eyelid joints.
Simulate eye blinks and squints for realism.
4. Simulating Secondary Motion
Lifelike animations go beyond primary movements. Let’s explore secondary motion:
Cloth Simulations
Rig clothing with cloth simulations (nCloth, Marvelous Designer).
Achieve realistic folds, wrinkles, and draping.
Hair Dynamics
Rig hair with dynamic simulations (nHair, XGen).
Add wind forces for natural hair movement.
Conclusion:
Enroll in our 3D Animation Course in Hyderabad and take your skills to the next level. Whether you’re a beginner or an experienced animator, our expert instructors will guide you toward animation excellence.
0 notes
hopper-miller-lvl6 · 5 months
Text
FMP Reflection
Undertaking my final major project thrust me into unfamiliar territory, requiring me to master new programs like Blender and techniques such as inverse kinematics. However, what appeared simple in theory often proved time-consuming in practice, particularly when animating my models. 
My ambition to create a VR application led to a deep dive into VR development, a path that eventually diverged from my final outcome. The journey was not without its challenges. The constant shifting of goals and the realization of skill limitations posed significant hurdles. However, I was able to adapt and recalibrate my expectations, demonstrating a resilience and adaptability that I am proud of. 
One significant hurdle was creating polished animations, a task made more complex by unreliable motion capture and the need for manual animation. However, I was not deterred. Instead, I embraced the challenge and found creative solutions. For instance, I simulated functionality within Blender, a move that not only saved time but also showcased the power of innovative thinking in problem-solving. 
Reflecting on the experience, I recognize the importance of allocating ample learning and development time. Balancing both simultaneously proved frustrating and hindered progress. 
While the project demanded sacrifices in efficiency and polish for deadlines, it bolstered my confidence in rapid learning and problem-solving. Moving forward, I carry newfound confidence in navigating 3D spaces and embracing the challenge of acquiring new skills. 
In hindsight, I acknowledge the value of strategic decision-making and wisely choosing battles. Rather than fixating on one approach, I recognize the benefit of exploring alternative paths that may lead to better outcomes.
0 notes
fmprojectleonardo · 5 months
Text
Information about Rigs
A rig is the skeleton of a model: it is what gives the animator the power to manipulate and control the movement and deformation of characters or objects. The skeleton, made up of joints and bones, defines how a character or object can move. There are two main approaches: forward kinematics (FK) and inverse kinematics (IK). The rig that I have uses IK, which makes the legs automatically bend when the rig is lowered. The arms of the rig do not have this feature, so I would have to make it look as natural as possible when a character falls to the ground or hits a wall. Some software, like Maya, has auto-rigging, which automatically rigs the model for the animator. Blender does not have auto-rigging built in; however, you can get add-ons for it that provide auto-rigging.
Tumblr media
0 notes
mcneeart · 6 months
Text
Rigging Terminology
Joint
Bone
Tumblr media
Control
Hierarchy
Parent/Child/Sibling
FK (Forward Kinematics)
IK (Inverse Kinematics)
Blendshape
Constraint
Deformer
Skinning
Ribbons
Gimbal
World vs Local Orientation
Follicles
Rotation Order
Deformation
Cluster
Dual Quaternion
Space Swapping
0 notes