#chatgpt generated posts
alwaysbewoke · 11 months ago
Text
Tumblr media
honestly, FUCK ISRAEL!!
86 notes · View notes
saveahorserideaneddie · 1 month ago
Text
scrolling buddie fics on ao3: oh hey this fic looks-
tags: this fic was written using chatgpt
me:
Tumblr media
674 notes · View notes
seaglasswrites · 2 months ago
Text
I see a lot of people advocate for the use of AI models in writing as a tool for when writers are stuck. The main selling point for these people seems to be that, when facing writer’s block, a writer can just plug their story into one of these tools and get “help”.
It’s a similar idea to a lot of writing posts I’ve seen on here, complaining about the “in-between” - “I’ve got this wonderful beginning and this heart-wrenching ending, but no idea what to put in the middle! Writing sucks!”
These people don’t seem to realize, though, that without the author figuring that out for themselves, there is no story.
Sure, you can have a basic idea for a plot. Let’s use 1984 as an example: a man lives in a hyper-surveillance society under an authoritarian dictatorship, and rebels against it by joining a secret society that turns out to have been the government all along.
That’s a great plot idea - and it’s sure to do well with publishers and readers alike! But it’s not 1984. It’s a plot summary of 1984.
If George Orwell had plugged that prompt into ChatGPT and asked it to do the rest for him, we would probably still have Winston Smith (or someone like him), but we might not have Julia, or O’Brien, or the scene with the rats, or the melancholy ending at the café, or a whole host of other important characters and plot points.
Why? Because here’s the thing - Orwell came up with those ideas because he actually thought about the premise he had imagined. What would people act like in such a society? What kind of torture methods would their government use?
Even the ending scene where Winston sits at the café can have a million different things said about it when it comes to Orwell’s thought process when he wrote it. What would this government do with its victims once it was done torturing them? How would it make a public example of its power without actively televising said torture? How would “normal” citizens treat these victims? What would their short remaining lives be like?
If you put the basic details into ChatGPT, though - “dystopian government, surveillance, torture, betrayal” - it wouldn’t give you the same result.
Every decision you see in a book, movie, or other piece of media that you love is there because the author got stumped at one point and had to think their way out of it.
Ask any famous author about their writing process. Read or watch any interview. There will always be a point where they had no idea where to take the story next, and often the best parts of those stories are the ones that came out of writer’s block.
Writing is all about getting stumped, and confused, and not knowing where to go next. It’s okay to not always know what you’re doing. But you do actually have to think your way out of it. Otherwise, you’re not writing.
237 notes · View notes
superfallingstars · 2 months ago
Text
Ok my actual take on the Greasecourse™ (which I’m sure everyone was hotly anticipating) is something I’ve said over and over again on this blog, which is that nearly anything can work with the right framing. In other words, you can pick basically any reason for Snape’s hair to be greasy as long as you can make it make sense within the story that you're trying to tell. There just has to be a reason (a good reason – that’s the part that gets a little more arbitrary) why you picked one explanation over another. This will probably make more sense if I give some examples:
Snape’s hair is greasy because he never developed good hygiene habits due to poverty and neglect -> Emphasizes how his childhood and upbringing still deeply affects him
Snape’s hair is greasy because he has poor hygiene habits due to depression -> I’m not going to go through all the reasons this guy could be depressed, there’s literally so many
Snape’s hair is greasy because of sensory issues -> This is probably one of many ways his sensory issues/neurodivergence/autism manifests, how else does that affect him?
Snape's hair is greasy because of potion fumes -> Imo this one runs the risk of being boring if you make it sound like it’s totally out of his control... the important part would be to consider how Snape feels about and reacts to it. But depending on how you deal with that, I think you can make it work in a few ways. Maybe it shows that Snape is industrious and always working, even to the detriment of his hygiene, or it could show that he cares more about the things that interest him than he does about what others think of him... or a secret third thing
Snape actually tries to make his hair look nice, but it’s just naturally very greasy -> I quite like this one as a follow-up to the first one (the never developed good hygiene habits one). Even when he tries to do the right thing (the correct thing?), he still ends up failing lol. Emphasizes his wet pathetic side
Snape’s hair is greasy because he’s a loser and a bad person -> Well I think it’s very boring to make a character ugly to indicate that they are Morally Bad but it's certainly a strategy... sometimes it's very clear that an author is trying to make the reader dislike a character when they describe them as gross/ugly/whatever. Hell Rowling did this literally all the time (Not that that was a good thing lol)
Alright I'm going to stop myself there because omg I didn't think I would have so much to say about Snape's greasy hair. Good lord
164 notes · View notes
bleakparadise · 13 days ago
Text
Tumblr media
67 notes · View notes
charl0ttan · 5 months ago
Note
how does charlobotomy work.... asking in a Perverted way by the way. Not towards you but towards charlobotomy.
Tumblr media
147 notes · View notes
annagxx · 2 months ago
Text
I’ve tasted therapy, I’ve tasted venting to friends, I highly recommend ChatGPT.
57 notes · View notes
baejax-the-great · 10 months ago
Text
Fanfic isn't a possession.
When an author deletes a work on AO3, they haven't taken anything from the readers. It's not a physical object that can be snatched away, and if readers wanted a digital copy to keep, AO3 made that possible. It is the reader's choice whether or not to keep a copy. The author has absolutely no say over whether or not they do this.
If you have a favorite exhibit at the museum, and that exhibit closes for whatever reason, nobody "stole" art from you. Same thing with whatever play is currently at your local theater--it's temporary, and when it closes, the actors didn't take away your entertainment. Before streaming and box sets of tv shows being sold at Target, all of television was ephemeral, too. You caught it or you didn't.
I don't know why there is an expectation that fanfic should be permanently made available to readers. The archive certainly gives a place where that can happen, but there is no reason to assume that it always will happen, nor that readers are entitled to that.
There are many reasons to delete a fanfic--not orphan, not post anonymously, but delete. It's fine to be disappointed if that happens. It's not fine to harass authors or try to wrest control of the work from them or create archives full of stolen work.
159 notes · View notes
lucelute · 1 month ago
Text
You know, I have refused to use AI from the very first moment. Am I insecure about my drawing or the ideas I have? Yes, but they are mine.
I may have been inspired by someone else’s art or writing, but I made it from my own mind and gave it my own twist; it’s mine.
And the truth is, the only reason we sometimes can’t tell AI from humans is that we aren’t used to seeing anything other than humans doing certain things.
But let’s be honest, especially those of us who spend a lot of time on the internet: don’t you ever see a piece made by AI and, no matter how polished it seems, get a slight feeling that it wasn’t made by a human? That the “person” writing it is off? I do, and a lot of other people seem to feel how off it is too.
The only reason corporations are pushing AI is that it feels new and easy, and it cuts jobs - after all, why employ so many people when AI answers the question more quickly?
39 notes · View notes
tinashernow · 3 months ago
Text
DO NOT TALK TO ME ABOUT AI ILL KILL MYSELF IN FRONT OF YOU
44 notes · View notes
applepiealopecoid · 2 months ago
Text
i’m so tired of ai and of people. just because someone doesn’t write traditionally or repeats what they’ve said before doesn’t mean they’re not writing, it means they’re a real person. sure, there are dead giveaways like using the exact same vocabulary every single time you say anything, but unless there’s something like that proving it’s ai, it’s incredibly disrespectful and hurtful to witch hunt someone for not writing as well as you think they should. you’re trying to say they write so badly, so well, or so strangely that there’s no way they wrote it, and signing off their writing as not belonging to them
23 notes · View notes
alwaysbewoke · 11 months ago
Text
Tumblr media Tumblr media Tumblr media
50 notes · View notes
landunderthewave · 24 days ago
Text
The idea that ChatGPT, which is basically a calculator but for words, will somehow become self-aware is so bizarre to me. Like yes, I get it, the output feels more human now, so it’s easy to fall into the illusion that it’s close to BECOMING human, but it’s not. Saying a chatbot will gain sentience if it just gets advanced enough is like saying that if I make a puppet that moves realistically enough, it will turn into Pinocchio and be a real boy. Or that once video game graphics get to a certain threshold, Mario will feel pain. It is a mechanistic imitation. It’s not alive.
– notcaptain1
20 notes · View notes
seaglasswrites · 2 months ago
Text
If you’re intent on using tools like ChatGPT to write, I’m probably not going to be able to convince you not to. I do, however, want to say one thing, which is that you have absolutely nothing to gain from doing so.
A book that has been generated by something like ChatGPT will never be the same as a book that has actually been written by a person, for one key reason: ChatGPT doesn’t actually write.
A writer is deliberate: they plot events in the order they have determined is best for the story, introduce certain elements and characters where it will be most beneficial, and add symbolism and metaphors throughout their work.
The choices the author makes are what create the book; it would not exist without deliberate actions taken over a long period of thinking and planning. Everything that’s in a book is there because the author put it there.
ChatGPT is almost the complete opposite of this. Despite what many people believe, humanity hasn’t actually invented a thinking machine yet; ChatGPT and similar models don’t think the way humans do.
ChatGPT works by statistically modelling the enormous amount of internet text it was trained on and predicting which words are most likely to come next. If you ask it a question, there’s not only a good chance it will give you the wrong answer, but that you’ll get a hilariously wrong answer; that happens because its training data includes sources like Reddit and other social media, often comments meant as jokes, and the model has no reliable way of telling a joke from a fact.
(A major example of this is Google’s new “AI Overview” feature: look up its responses and you’ll see the infamous machine telling people to add glue to their pizza, eat rocks, and jump off the Golden Gate Bridge if they’re feeling suicidal.)
Anything “written” by ChatGPT would be cobbled together from patterns in many different sources, a good portion of which would probably conflict with one another. If all you’re telling the language model to do is “write me a book about [x]”, it’s going to draw on a variety of different novels and put together what it has determined makes a good book.
Have you ever read a book that felt clunky at times, and later found out that it had multiple different authors? That’s the best comparison I can make here: a novel “written” by a language model like ChatGPT would resemble a novel co-written by a large group of people who didn’t adequately communicate with one another, with the “authors” in this case being multiple different works that were never meant to be stitched together.
So, what do you get in the end? A not-very-good, clunky novel that you yourself had no hand in making beyond the base idea. What exactly do you have to gain from this? You didn’t get any practice as a writer (To do that, you would have to have actually written something), and you didn’t get a very good book, either.
Writing a book is hard. It’s especially hard when you’re new to the craft, or have a busy schedule, or don’t even know what it is you want to write. But it’s incredibly rewarding, too.
I like to think of writing as a reflection of the writer: by writing, we reveal things about ourselves that we often don’t even understand or realize. You can tell a surprising amount about a person from their work; I fail to see what you could learn about a prompter from reading a GPT-generated novel besides which works it pulled from.
If you really want to use ChatGPT to generate your novel for you, then I can’t stop you. But by doing so, you’re losing out on a lot - including what could be an amazing novel if you actually took the time to write it yourself.
Delete the app and add another writer to the world. You have nothing to lose.
49 notes · View notes
coffeefromvoid · 1 month ago
Text
I need ChatGPT eradicated with a laser and all traces of it incinerated from this earth. There are so many fucking reasons why genAI is terrible, but in this post I want to talk a bit about my biggest personal issue: students using it in school.
Ok so, my school is a so-called “iPad Schule” (iPad school) in Germany, meaning we use iPads instead of physical notebooks and preferably (sadly not always) digital books. (The fact that we don’t even get the books digitally is a whole other thing I could rant about, but it’s not the point of this post.) Having a device that is connected to the world wide web in class is obviously a double-edged sword.
On one hand there are teachers who encourage us to look things up and make class slightly more interactive in a modern way, but on the other hand there are many teachers who don’t know how to regulate the use of the iPads at school, resulting in students playing random games (best case scenario, in my opinion) or using ChatGPT to do all the work (worst case scenario).
Our teachers have tools to supervise what we are doing on our iPads at school, but a lot of them don’t use those tools for some reason. Which is why a lot of students are using genAI for anything and everything, and it pisses me the fuck off.
Not only because these students:
1. practically never get caught,
2. get graded well and praised for work they didn’t even do (at least when they don’t get caught), because the AI shit they read aloud mindlessly or copy out in their own handwriting to make it seem like they wrote it sounds good (that doesn’t mean it’s correct, but ChatGPT is sadly capable of basic grammar), and
3. the accomplishments and work of students who don’t use genAI in class get less attention from the teacher, or just get compared to something an AI has spat out. And that is fucking horrific and disgusting, and it sucks away any motivation you have to do well in school without fucking AI. Some teachers sadly even expect at this point that most students are just parroting AI instead of actually working themselves! What the fuck!
(Also a reason why teachers shouldn’t be fucking surprised nowadays when we all have bad grades, if this is how their classes look.)
And I, on a purely personal level, find this situation my school has found itself in deeply disappointing and infuriating. Because school is supposed to be a place where developing people learn how to learn, learn basic knowledge of things, and learn a work ethic for the future, whatever their future might hold.
And yeah, that takes work and is exhausting. I have 14 subjects this year and it is so, so exhausting. But just because school is tiring and the whole education system needs a complete overhaul from the ground up doesn’t mean I’ll give in to genAI. I know this might sound like I’m on a high horse, but the one thing the school system does right is teach kids how to think for themselves. And that is exactly what the use of ChatGPT in schools removes. And that is so fucked up to me.
I don’t have a satisfying end to this post, it just makes me so upset that apparently it’s totally socially acceptable in school now to literally not think for yourself, because that is too much work. HUH???
Sometimes the solution is to pop an energy drink late at night and study through, rather than use genAI. At least then I know I learned and tried to understand something myself, and maybe because of that the knowledge will stick with me, instead of cheaply using genAI, learning nothing, and polluting the planet on top of it.
20 notes · View notes
thatsgazebo · 8 months ago
Text
So…generative ai, huh.
I’ll admit I’ve used it in the past for generating writing prompts. I thought that because I was using the technology to boost my own creativity, and not using the AI for direct art generation, I was using AI ethically.
Long story short, it’s not that simple. AI is rapidly using up natural resources - and that doesn’t even scratch the surface of its inherent plagiarism. Please do your research before hopping on the AI bandwagon.
This article is a great place to start:
Educate yourself, and, above all, keep on creating! There’s no better way to bust ai than to use the AI (actual intelligence) in your head :)
28 notes · View notes