thinking about wallace wells. thinking about going through hook ups like tissue paper, never believing anybody will stick around besides scott, who's here only because he has nowhere else to go, and you let him stay anyway even though he doesn't pay the rent. one of the only consistent people in your life, someone you might've actually genuinely liked, straight up dying and leaving you with a sudden void of an empty apartment and a cold spot in a futon. thinking about immediately getting wasted and bringing a guy home, someone whose name you won't remember, but it's okay because you're only in it for the sex— you don't believe in sparks, after all. believing that scott's conception of his one true girl was a joke because you just don't think you'll ever love anyone like that. kissing someone on a movie set because it's something to do, because he's dressed in the costume of somebody you cared for, because it's all manufactured, false realities and layers of separation deep enough for you to brush off his pleas for connection. thinking about going to paris after everything, the city of love, as tacky as that is, saying you're only there to spend money. but despite the insistence on irony you meet a guy— a fellow canadian, actually, twin foreigners in an unfamiliar place. someone who actually wants to stick around, who follows you through the city to see the sights and seems to genuinely like you. it can't be genuine, though— can't possibly be a reason to stay beyond a few hookups. so you stop at the river and you kiss him to get it over with...
but instead you see sparks.
When Facebook came for your battery, feudal security failed
When George Hayward was working as a Facebook data-scientist, his bosses ordered him to run a “negative test,” updating Facebook Messenger to deliberately drain users’ batteries, in order to determine how power-hungry various parts of the apps were. Hayward refused, and Facebook fired him, and he sued:
https://nypost.com/2023/01/28/facebook-fires-worker-who-refused-to-do-negative-testing-awsuit/
If you’d like an essay-formatted version of this post to read or share, here’s a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2023/02/05/battery-vampire/#drained
Hayward balked because he knew that among the 1.3 billion people who use Messenger, some would be placed in harm’s way if Facebook deliberately drained their batteries — physically stranded, unable to communicate with loved ones experiencing emergencies, or locked out of their identification, payment method, and all the other functions filled by mobile phones.
As Hayward told Kathianne Boniello at the New York Post, “Any data scientist worth his or her salt will know, ‘Don’t hurt people…’ I refused to do this test. It turns out if you tell your boss, ‘No, that’s illegal,’ it doesn’t go over very well.”
Negative testing is standard practice at Facebook, and Hayward was given a document called “How to run thoughtful negative tests” regarding which he said, “I have never seen a more horrible document in my career.”
We don’t know much else, because Hayward’s employment contract included a non-negotiable binding arbitration waiver, which means that he surrendered his right to seek legal redress from his former employer. Instead, his claim will be heard by an arbitrator — that is, a fake corporate judge who is paid by Facebook to decide whether Facebook was wrong. Even if the arbitrator finds in Hayward’s favor — something that arbitrators do far less frequently than real judges do — the judgment, and all the information that led up to it, will be confidential, meaning we won’t get to find out more:
https://pluralistic.net/2022/06/12/hot-coffee/#mcgeico
One significant element of this story is that the malicious code was inserted into Facebook’s app. Apps, we’re told, are more secure than real software. Under the “curated computing” model, you forfeit your right to decide what programs run on your devices, and the manufacturer keeps you safe. But in practice, apps are just software, only worse:
https://pluralistic.net/2022/06/23/peek-a-boo/#attack-helicopter-parenting
Apps are part of what Bruce Schneier calls “feudal security.” In this model, we defend ourselves against the bandits who roam the internet by moving into a warlord’s fortress. So long as we do what the warlord tells us to do, his hired mercenaries will keep us safe from the bandits:
https://locusmag.com/2021/01/cory-doctorow-neofeudalism-and-the-digital-manor/
But in practice, the mercenaries aren’t all that good at their jobs. They let all kinds of badware into the fortress, like the “pig butchering” apps that snuck into the two major mobile app stores:
https://arstechnica.com/information-technology/2023/02/pig-butchering-scam-apps-sneak-into-apples-app-store-and-google-play/
It’s not merely that the app stores’ masters make mistakes — it’s that when they screw up, we have no recourse. You can’t switch to an app store that pays closer attention, or that lets you install low-level software that monitors and overrides the apps you download.
Indeed, Apple’s Developer Agreement bans apps that violate other services’ terms of service, and they’ve blocked apps like OG App that block Facebook’s surveillance and other enshittification measures, siding with Facebook against Apple device owners who assert the right to control how they interact with the company:
https://pluralistic.net/2022/12/10/e2e/#the-censors-pen
When a company insists that you must be rendered helpless as a condition of protecting you, it sets itself up for ghastly failures. Apple’s decision to prevent every one of its Chinese users from overriding its decisions led inevitably and foreseeably to the Chinese government ordering Apple to spy on those users:
https://pluralistic.net/2022/11/11/foreseeable-consequences/#airdropped
Apple isn’t shy about thwarting Facebook’s business plans, but it uses that power selectively — it blocked Facebook from spying on iPhone users (yay!), then covertly spied on its own customers in exactly the same way, for exactly the same purpose, and lied about it:
https://pluralistic.net/2022/11/14/luxury-surveillance/#liar-liar
The ultimate, irresolvable problem of feudal security is that the warlord’s mercenaries will protect you against anyone — except the warlord who pays them. When Apple or Google or Facebook decides to attack its users, the company’s security experts will bend their efforts to preventing those users from defending themselves, turning the fortress into a prison:
https://pluralistic.net/2022/10/20/benevolent-dictators/#felony-contempt-of-business-model
Feudal security leaves us at the mercy of giant corporations — fallible and just as vulnerable to temptation as any of us. Both binding arbitration and feudal security assume that the benevolent dictator will always be benevolent, and never make a mistake. Time and again, these assumptions are proven to be nonsense.
Image:
Anthony Quintano (modified)
https://commons.wikimedia.org/wiki/File:Mark_Zuckerberg_F8_2018_Keynote_%2841118890174%29.jpg
CC BY 2.0:
https://creativecommons.org/licenses/by/2.0/deed.en
[Image ID: A painting depicting the Roman sacking of Jerusalem. The Roman leader's head has been replaced with Mark Zuckerberg's head. The wall bears Apple's 'Think Different' wordmark and an iOS 'low battery' icon.]
Next week (Feb 8-17), I'll be in Australia, touring my book *Chokepoint Capitalism* with my co-author, Rebecca Giblin. We'll be in Brisbane on Feb 8, and then we're doing a remote event for NZ on Feb 9. Next is Melbourne, Sydney and Canberra. I hope to see you!
https://chokepointcapitalism.com/