#it's not dissimilar to e.g. pigeon and rat psychology experiments
paradife-loft · 2 years
it occurs to me that part of this "chatgpt will make up nonexistent articles and quotes and then attribute them to real people" issue - I suspect it comes from two goals being at cross-purposes when you train a program on general writing: "reference and cite Actual Things In the World where appropriate" and "don't plagiarize existing writing" - especially when the system doesn't already have any background understanding of the difference between "real" and "fake".
like, you have something whose whole understanding is: "here are strings of text! I know a lot about how pieces of these strings tend to get put together to form larger blocks of text. I also know it's important not to arrange those pieces in ways that directly match existing blocks of highly distinctive text in my dataset."
so like... it completely makes sense that it'd have trouble in context with something like "citing real scientific article titles"? unless you put it through much more rigorous training designed specifically to teach it what (contextually) makes a string an "article title" as opposed to other text, and what distinguishes "citing something [i.e. reproducing an exact copy of existing unique text in what it generates]" from "plagiarizing it"... that's probably not a task the model is equipped to handle.
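(to make that "recombine plausible pieces without copying" dynamic concrete: here's a toy sketch in Python - emphatically not how chatgpt or any real model is actually built, and all the "training" titles are made up - a word-level bigram sampler that will happily produce something that reads like an article title without matching any title it has actually seen:)

```python
# Toy illustration only: a model that knows "which words tend to follow which"
# can emit a fluent, title-shaped string that cites nothing real.
import random
from collections import defaultdict

# Hypothetical "training" titles, standing in for real citations in a corpus.
titles = [
    "Deep learning for natural language inference",
    "Attention mechanisms in natural language processing",
    "Transfer learning for low-resource language modeling",
    "Deep reinforcement learning for dialogue generation",
]

# Word-level bigram table: each word maps to the words observed after it.
follows = defaultdict(list)
for t in titles:
    words = t.lower().split()
    for a, b in zip(words, words[1:]):
        follows[a].append(b)

def sample_title(start="deep", max_len=8):
    # Walk the bigram table, picking a plausible next word at each step.
    out = [start]
    while len(out) < max_len and follows[out[-1]]:
        out.append(random.choice(follows[out[-1]]))
    return " ".join(out)

random.seed(1)
fake = sample_title()
print(fake)  # e.g. "deep learning for dialogue generation"
# Usually False: the output is fluent and title-shaped, but it's a recombination
# rather than a copy, so it matches no title the "model" was actually shown.
print(any(fake == t.lower() for t in titles))
```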