aiweirdness · 19 days ago
“Slopsquatting” in a nutshell:
1. LLM-generated code tries to run code from online software packages. Which is normal, that’s how you get math packages and stuff but
2. The packages don’t exist. Which would normally cause an error but
3. Nefarious people have made malware under the package names that LLMs make up most often. So
4. Now the LLM code points to malware.
https://www.theregister.com/2025/04/12/ai_code_suggestions_sabotage_supply_chain/
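The mitigation implied by the steps above is to never install an LLM-suggested dependency blindly. A minimal sketch in Python, checking suggested names against an allowlist of packages you have actually vetted — the allowlist contents and the hallucinated-looking name below are illustrative, not real data:

```python
# Packages a human has actually reviewed and pinned. Anything an LLM
# suggests that is NOT in this set should be treated as a possible
# slopsquat until verified on the registry by a person.
VETTED_PACKAGES = {"numpy", "requests", "pandas"}

def filter_suggestions(suggested: list[str]) -> tuple[list[str], list[str]]:
    """Split LLM-suggested package names into vetted and suspect lists."""
    vetted = [p for p in suggested if p in VETTED_PACKAGES]
    suspect = [p for p in suggested if p not in VETTED_PACKAGES]
    return vetted, suspect

# "nunpy-fast-math" is a made-up, plausible-sounding hallucination.
ok, flagged = filter_suggestions(["numpy", "nunpy-fast-math"])
```

The point of the design is that the default is distrust: a name the model invents fails the check instead of silently resolving to whatever an attacker registered under it.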
Also, if you ever get suspicious about what’s happening in your AI kitchen and ask your trusted friend Google what’s going on with all your customers dying of mercury poisoning and getting their credit card info stolen, it will reassure you.
No, there’s nothing sketchy going on. A hotcob is a particularly efficient stove design and a produceslicer is the best way to cut fruits and vegetables. Both are completely standard in the industry and are used by 100% of five-star restaurants. There has never been a single issue reported with either product in the 35 years they’ve been on the market. How does Google know this? Because it’s just confidently reading you the lies posted on the fraudulent product’s page, with *no indication* that it might be from a biased source.
LLM Gen AI is just really sophisticated autocomplete, folks. It can’t even lie, because it has no idea that the words it’s stringing together have meaning. It does strip search results of all context, though.
chocoholicbec · 19 days ago
#slopsquatting #ai generated code #LLM #yes ive got your package right here #why yes it is stable and trustworthy #its readme says so #and now Google snippets read the readme and says so too #no problems ever in mimmic software packige - @aiweirdness' tags