pinoy hazbin hotel fans can confirm that lucifer is behind the duck hair clip trend
(context: the latest trend here in the Philippines is wearing a duck hair clip on your head or even clipping it on your bag as an accessory.)
Joanne/Joan Fanart
I made fanart for @artsandstoriesandstuff's new OC Joanne/Joan. They're a housewife who's just a tiny bit...Stressed.
This is a rough MS Paint sketch because I don't have the energy for a proper CSP drawing. And I wanted to get this idea out! I was surprised how similar the sketch was to how I envisioned her when I suggested her, lol.
The quality is kinda shit. Sorry.
No reposting, editing, claiming, stealing, or tracing this. No using it for any program, NFT, or AI.
Hope you like it haha. Just a little thing.
wild finding out that people legitimately hate having imagine dragons on the arcane soundtrack. some of the best animation to come out of the last decade has imagine dragons on the track. listen to the sexy downbeats of enemy and think about how sick it is to see jayce smash someone's head in to it. the imagine dragons is essential to the lol experience, and acting like arcane would be a better show without it is stupid, and more importantly, embarrassing. it's like walking into a neighborhood bar everyone hates, right after it got a wicked makeover, and complaining about the one thing people actually kind of liked about the original.
the darling Glaze “anti-ai” watermarking system is a grift: it stole code and violated the GPL (which its creator has admitted), and it uses the same exact technology as Stable Diffusion. It’s not going to protect you from LoRAs (smaller add-on models that imitate a certain style, character, or concept).
An invisible watermark is never going to work. “De-glazing” training images is as easy as running them through a denoising upscaler. If someone really wanted to make a LoRA of your art, Glaze and Nightshade are not going to stop them.
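The intuition behind that claim can be shown with a toy sketch. This is pure Python with made-up numbers, and it has nothing to do with Glaze's actual algorithm; it only illustrates why a low-amplitude, high-frequency perturbation (which is what an "invisible" watermark has to be) doesn't survive even the crudest denoising pass, here a 3-tap box blur:

```python
# Toy model: an "invisible" perturbation is a small per-pixel offset.
# A simple 3-tap box blur (a very crude denoiser) averages it away
# while barely changing the underlying image.

def box_blur(pixels):
    """1-D 3-tap box blur with edge clamping."""
    n = len(pixels)
    return [
        (pixels[max(i - 1, 0)] + pixels[i] + pixels[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]

# A smooth "image" row plus a high-frequency +/-2 perturbation.
clean = [100, 102, 104, 106, 108, 110, 112, 114]
noise = [2 if i % 2 == 0 else -2 for i in range(len(clean))]
perturbed = [c + n for c, n in zip(clean, noise)]

denoised = box_blur(perturbed)

# After one blur pass, the worst-case deviation from the clean image
# is already below the original +/-2 perturbation amplitude.
residual = max(abs(d - c) for d, c in zip(denoised, clean))
print(residual)
```

A real pipeline would use an actual denoising upscaler rather than a blur, but the principle is the same: the perturbation is small by design, so anything that smooths small details destroys it.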
If you really want to protect your art from being used as positive training data, use a proper, obnoxious watermark, with your username/website, with “do not use” plastered everywhere. Then, at the very least, it’ll be used as a negative training image instead (telling the model “don’t imitate this”).
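For what a "proper, obnoxious watermark" means in practice, here is a minimal sketch using Pillow (the function name, tile spacing, and placeholder text are all illustrative, not any standard tool): the notice is tiled across the whole image at partial opacity, so no crop removes every copy, unlike an invisible perturbation.

```python
# Minimal visible-watermark sketch with Pillow.
# Assumed/illustrative: loud_watermark, the grid spacing, and the text.
from PIL import Image, ImageDraw

def loud_watermark(img, text="DO NOT USE - @yourname"):
    out = img.convert("RGBA")
    overlay = Image.new("RGBA", out.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    # Repeat the notice on a grid so every crop contains at least one copy.
    for y in range(0, out.height, 60):
        for x in range(0, out.width, 200):
            draw.text((x, y), text, fill=(255, 255, 255, 128))
    return Image.alpha_composite(out, overlay)

# Demo on a plain gray canvas standing in for a piece of art.
marked = loud_watermark(Image.new("RGB", (400, 240), "gray"))
```

The point is that the mark is part of the visible content: stripping it means repainting the image, not just running a filter over it.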
There is never a guarantee your art hasn’t been scraped and used to train a model. Training sets aren’t commonly public. Once you share your art online, you don’t know every person who has seen it, saved it, or drawn inspiration from it. Similarly, you can’t name every influence and inspiration that has affected your art.
I suggest that anti-AI art people get used to the fact that sharing art means letting go of the fear of being copied. Nothing is truly original. Artists have always copied each other, and now programmers copy artists.
Capitalists, meanwhile, are excited that they can pay less for “less labor”. Automation and technology are an excuse to undermine and cheapen human labor—if you work in the entertainment industry, it’s adopt AI and speed up your workflow, or lose your job for being less productive. This is not a new phenomenon.
You should be mad at management. You should unionize and demand that your labor be compensated fairly.