#no not everyone sharing specific opinions are psyops. but some of them are
lurkiestvoid · 4 months
You're being targeted by disinformation networks that are vastly more effective than you realize. And they're making you more hateful and depressed.
(This essay was originally written by u/walkandtalkk and posted to r/GenZ on Reddit two months ago. I've crossposted it here on Tumblr for convenience because it's relevant and well-written.)
TL;DR: You know that Russia and other governments try to manipulate people online. But you almost certainly don't know just how effectively orchestrated influence networks are using social media platforms to make you -- individually -- angry, depressed, and hateful toward each other. Those networks' goal is simple: to cause Americans and other Westerners -- especially young ones -- to give up on social cohesion and to give up on learning the truth, so that Western countries lack the will to stand up to authoritarians and extremists.
And you probably don't realize how well it's working on you.
This is a long post, but I wrote it because this problem is real, and it's much scarier than you think.
How Russian networks fuel racial and gender wars to make Americans fight one another
In September 2018, a video went viral after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, with millions of views and wide social media outrage. Reddit users wrote that it had turned them against feminism.
There was one problem: The video was staged. And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.
As an MIT study found in 2019, Russia's online influence networks reached 140 million Americans every month -- the majority of U.S. social media users.
Russia began using troll farms a decade ago to incite gender and racial divisions in the United States
In 2013, Yevgeny Prigozhin, a confidante of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government's first coordinated facility to disrupt U.S. society and politics through social media.
Here's what Prigozhin had to say about the IRA's efforts to disrupt the 2022 election:
"Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once."
In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media. By 2015, hundreds of English-speaking young Russians worked at the IRA. Their assignment was to use those false social-media accounts, especially on Facebook and Twitter -- but also on Reddit, Tumblr, 9gag, and other platforms -- to aggressively spread conspiracy theories and mocking, ad hominem arguments that incite American users.
In 2017, U.S. intelligence found that Blacktivist, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia. Blacktivist regularly attacked America as racist and urged Black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist's Twitter account urged Black Americans: "Choose peace and vote for Jill Stein. Trust me, it's not a wasted vote."
Russia plays both sides -- on gender, race, and religion
The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening both misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia. In short, it's not just an effort to boost the right wing; it's an effort to radicalize everybody.
Russia uses its trolling networks to aggressively attack men. According to MIT, in 2019, the most popular Black-oriented Facebook page was the charmingly named "My Baby Daddy Aint Shit." It regularly posted memes attacking Black men and government welfare workers. It served two purposes: make poor Black women hate men, and goad Black men into flame wars.
MIT found that My Baby Daddy is run by a large troll network in Eastern Europe likely financed by Russia.
But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.
The New York Times found that on January 23, 2017, just after the first Women's March, the Internet Research Agency began a coordinated attack on the movement. Per the Times:
More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.
They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.
But the Russian PR teams realized that one attack worked better than the rest: They accused the march's co-founder, Arab American activist Linda Sarsour, of being an antisemite. Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour. That may not seem like many accounts, but it worked: They drove the Women's March movement into disarray and eventually crippled the organization.
Russia doesn't need a million accounts, or even that many likes or upvotes. It just needs to get enough attention that actual Western users begin amplifying its content.
A former federal prosecutor who investigated the Russian disinformation effort summarized it like this:
It wasn’t exclusively about Trump and Clinton anymore. It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.
As the New York Times reported in 2022,
There was a routine: Arriving for a shift, [Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.
China is joining in with AI
[A couple months ago], the New York Times reported on a new disinformation campaign. "Spamouflage" is an effort by China to divide Americans by combining AI with real images of the United States to exacerbate political and social tensions in the U.S. The goal appears to be to cause Americans to lose hope, by promoting exaggerated stories with fabricated photos about homeless violence and the risk of civil war.
As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake. Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”
The influence networks are vastly more effective than platforms admit
Russia now runs its most sophisticated online influence efforts through a network called Fabrika. Fabrika's operators have bragged that social media platforms catch only 1% of their fake accounts across YouTube, Twitter, TikTok, Telegram, and other platforms.
But how effective are these efforts? By 2020, Facebook's most popular pages for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn't just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And, even, on Reddit.
It's not just false facts
The term "disinformation" undersells the problem. Because much of Russia's social media activity is not trying to spread fake news. Instead, the goal is to divide and conquer by making Western audiences depressed and extreme.
Sometimes, through brigading and trolling. Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that's how most people feel. And sometimes, by using trolls to disrupt threads that advance Western unity.
As the RAND think tank explained, the Russian strategy is volume and repetition, from numerous accounts, to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them. And it's not just low-quality bots. Per RAND,
Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. ... According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.
What this means for you
You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed. It's not just disinformation; it's also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions.
It's why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms. And a lot of those trolls are actual, "professional" writers whose job is to sound real.
So what can you do? To quote WarGames: The only winning move is not to play. The reality is that you cannot distinguish disinformation accounts from real social media users. Unless you know whom you're talking to, there is a genuine chance that the post, tweet, or comment you are reading is an attempt to manipulate you -- politically or emotionally.
Here are some thoughts:
Don't accept facts from social media accounts you don't know. Russian, Chinese, and other manipulation efforts are not uniform. Some will make deranged claims, but others will tell half-truths. Or they'll spin facts about a complicated subject, be it the war in Ukraine or loneliness in young men, to give you a warped view of reality and spread division in the West.
Resist groupthink. A key element of manipulation networks is volume. People are naturally inclined to believe statements that have broad support. When a post gets 5,000 upvotes, it's easy to think the crowd is right. But "the crowd" could be fake accounts, and even if they're not, the brilliance of government manipulation campaigns is that they say things people are already predisposed to think. They'll tell conservative audiences something misleading about a Democrat, or make up a lie about Republicans that catches fire on a liberal server or subreddit.
Don't let social media warp your view of society. This is harder than it seems, but you need to accept that the facts -- and the opinions -- you see across social media are not reliable. If you want the news, do what everyone online says not to: look at serious, mainstream media. It is not always right. Sometimes, it screws up. But social media narratives are heavily manipulated by networks whose job is to ensure you are deceived, angry, and divided.
Edited for typos and clarity. (Tumblr-edited for formatting and to note a sourced article is now older than mentioned in the original post. -LV)
P.S. Apparently, this post was removed several hours ago due to a flood of reports. Thank you to the r/GenZ moderators for re-approving it.
Second edit:
This post is not meant to suggest that r/GenZ is uniquely or especially vulnerable, or to suggest that a lot of challenges people discuss here are not real. It's entirely the opposite: Growing loneliness, political polarization, and increasing social division along gender lines are real. The problem is that disinformation and influence networks expertly, and effectively, hijack those conversations and use those real, serious issues to poison the conversation. This post is not about left or right: Everyone is targeted.
(Further Tumblr notes: since this was posted, there have been several more articles detailing recent discoveries of active disinformation/influence and hacking campaigns by Russia and its allies against several countries and their respective elections, and this barely touches on the numerous Tumblr blogs discovered to be troll farms/bad-faith actors from pre-2016 through today. This is an ongoing and very real problem, and it's nowhere near over.
A quote from the NPR article linked above from 2018 that you might find familiar today: "[A] particular hype and hatred for Trump is misleading the people and forcing Blacks to vote Killary. We cannot resort to the lesser of two devils. Then we'd surely be better off without voting AT ALL," a post from the account said.)
aahsoka · 2 months
The best vaccine for propaganda is developing critical thinking skills. This is NOT flat out ignoring anything that goes against your opinion, or believing that anyone who holds a dissenting opinion from your own is a bot programmed to interfere with the election or an idiot. Frankly, even right-wing voters are not idiots. There are logical reasons that they vote the way they do, even if you do not agree with their logic.
I want to talk about the 2016 Russian Election Interference on Tumblr specifically:
In 2018, 84 blogs were deleted by Tumblr after a joint investigation between them and the Department of Justice determined they were linked to the IRA (Internet Research Agency), an organization indicted for running troll farms for the Russian government. Here is the post made by Tumblr
Now I am going to ask you some questions:
What is the difference between a bot and a troll? Is that difference important to acknowledge?
Do we as users/citizens have a full understanding of why these blogs were determined to be connected to the IRA? Do we know the process the DOJ and Tumblr used to determine their link to the troll farm?
Supposedly many posed as Black activists. But how do you decide who is a Blackfishing Russian troll and who is a genuine Black American? Were some of these blogs Russian trolls, and were some of those deleted for reblogging from them actual Black people who genuinely agreed with their posts, regardless of their origin? What was the functional difference between their posts and the posts made by actual Black users about Black issues? Do you think a Russian agent is capable of effectively pretending to be a Black American? What about today: how do you know if the person with a dissenting opinion is a disgruntled US citizen or a troll? “But even if some of the information posted was false, it is impossible to know if that was done with malicious intent or by mistakenly re-sharing fake news.”
One post shown in this analysis includes a headline posted without a link; are you taking the time to investigate the claim a post is talking about? One post was reposted from an actual Black user on Twitter; how much of this ‘trolling’ was reposts from real Black users on other sites? What is the functional difference, especially if the original user is credited?
Do you think this strategy would be as easy to implement now that most sites know to watch out for it?
Do you understand why some people do not trust government investigative bodies to be fair towards Black activists?
How effective was this election interference on tumblr and elsewhere? Are there other reasons for a lack of votes from certain demographics related to the US electoral system itself?
When you talk about the Russian Election Interference are you approaching it in a way that frames everyone who has a dissenting opinion as a bad actor employed by the Russian government? Do you understand why this makes some people extremely uncomfortable? Do you understand why it can read as red scare-type propaganda?
What are Russia’s actual current political goals? How are Russian citizens different from their government?
How and when has the US committed its own psyops within itself and in foreign states? Are you just as wary of propaganda that originates within the United States?
Do people have the right to voice extreme opinions about their government?
Are you getting your information solely from tumblr? Do you believe others are? Why?
I don’t want you to reblog and answer (that would take forever). I just want you to consider what your own opinions and thoughts are on these subjects, to research where you lack knowledge, and to consider how you can approach people with dissenting opinions with sympathy when they are acting in good faith.
ardenttheories · 4 years
vriskas-8log replied to your photo “@vriskas-8log While I'm not the person who compiled the original post,...”
i mean im gonna be honest here, I'm a transfem vriska kinnie, and while clearly vriska has done a whole lotta shit wrong, i dont find specifically the trans reading of the tavvrissprite scene objectionable? like clearly it wasnt intended as a trans narrative, hussie was not at all thinking about trans stuff when writing homestuck, least of all with vriska, but it is completely fair to read some aspects of her story in-comic as trans coding.
like do not get me wrong i fucking despise homestuck 2, honestly mainly because the way its treating vriska is awful and out of character and uncritically idolising her, but like vriska trans is fine actually. and kate is a piece of shit but like idk, i dont really care that she hates gamzee who very much is a shitty character (though like the way the epilogues treated him as even more of an awful anti-black stereotype is uhh, fucking yikes can we not)
and holy shit that fucking charlotte clymer take holy shit what the fuck news flash: trans women can have bad opinions kate, ~~you do it all the time~~ also like, is blaire white a CIA psyop? no shes just a shitty person who happens to be trans it fucking happens
Sorry about the delay, I had to find an actual, working computer since mine seems to not want to find the internet.
I definitely don’t think transwoman Vriska is an objectionable concept. Reading her as a transwoman in Homestuck is a completely valid reading, and in general I don’t actually object to the idea that people pick up on things that stick out to them as experiences they share, even if Hussie didn’t code them that way (I mean, for the longest time, that’s the best way we got any gender-based representation, and as a transman you can sure as fuck bet I clung to any vague potential of a person being coded as transmasc). Vriska does share a lot of trans experiences, especially pointed out in her Pesterquest route, and I think in general the idea that she’s being more canonically accepted as transfem is cool as fuck.
What I do find objectionable, however, is Kate taking a scene where Vriska is clearly uncomfortable being near her abuse victim, thinks so lowly of him that the idea of being fused with him is outright despicable to her, and then says it’s actually about Vriska being trans. Not that it could ALSO be seen as evidence of Vriska being trans, but that it is the ONLY interpretation of that scene. The implications it hosts there, especially ABOUT transwomen, are not great. It inherently associates transwomen with abusive women and their discomfort of facing their victims. It also completely overwrites the fact that Vriska, as a person, still views Tavros as sub-human. It completely overwrites Tavros as an abuse victim and Vriska as his abuser.
While it can definitely be evidence towards Vriska being transfem, the idea that Kate is saying “actually, you CAN’T read it this way, it has to be done THIS way or you’re transmisogynistic” is just. Deeply sickening. It’s just her once again trying to overwrite the fact that Vriska has done some extremely shitty things with something completely unrelated because she refuses to acknowledge that Vriska can actually be a bad person. It’s that sort of rewriting of Homestuck that really rubs me the wrong way, not so much the concept that someone is reading Vriska as a transwoman. 
To be completely honest, I gave up trying to keep up with HS^2 a long time ago. I’ve sort of blanked anything Gamzee did because it made me deeply uncomfortable and just felt like a bald-faced mockery more than anything else. But hm!! I might need to go back over it again since you pointed out he’s used as an anti-black stereotype. I know there was an ask I got once about the inherent racism of HS^2 and its writers, but the person never got back to me with evidence - but that seems like a good fucking place to start.
Yeah I’m! I’m really still sort of reeling from the whole “this woman is a cis woman, actually, and it’s a CIA coverup” thing. I’ve got no idea who she’s talking about, which is partly why I didn’t go over it in depth, but. That’s a pretty bold conspiracy theory for her to be spouting. Also highkey deeply transphobic, which I’m still kind of fucking losing my mind over. She suffers from such deeply mired internalised transphobia it’s not even funny, and then turns around and claims everyone else, actually, has internalised transphobia. Like you can’t just. Devalue an entire person’s gender identity because you don’t agree with them. 