#algorithmic surveillance
Explore tagged Tumblr posts
Text
#disciplinary society#digital age#surveillance culture#data control#digital governance#algorithmic surveillance#Foucauldian theory#modern panopticon#digital behavior regulation
0 notes
Note
As cameras become more normalized (Sarah Bernhardt encouraging it, grifters on the rise, young artists using it), I wanna express how I will never turn to it because it fundamentally bores me to my core. There is no reason for me to want to use cameras because I will never want to give up my autonomy in creating art. I never want to become reliant on an inhuman object for expression, least of all if that object is created and controlled by manufacturing companies. I paint not because I want a painting but because I love the process of painting. So even in a future where everyone’s accepted it, I’m never gonna sway on this.
if i have to explain to you that using a camera to take a picture is not the same as using generative ai to generate an image then you are a fucking moron.
#ask me#anon#no more patience for this#i've heard this for the past 2 years#“an object created and controlled by companies” anon the company cannot barge into your home and take your camera away#or randomly change how it works on a whim. you OWN the camera that's the whole POINT#the entire point of a camera is that i can control it and my body to produce art. photography is one of the most PHYSICAL forms of artmaking#you have to communicate with your space and subjects and be conscious of your position in a physical world.#that's what makes a camera a tool. generative ai (if used wholesale) is not a tool because it's not an implement that helps you#do a task. it just does the task for you. you wouldn't call a microwave a “tool”#but most importantly a camera captures a REPRESENTATION of reality. it captures a specific irreproducible moment and all its data#read Roland Barthes: Studium & Punctum#generative ai creates an algorithmic IMITATION of reality. it isn't truth. it's the average of truths.#while conceptually that's interesting (if we wanna get into media theory) that alone should tell you why a camera and ai aren't the same#ai is incomparable to all previous mediums of art because no medium has ever solely relied on generative automation for its creation#no medium of art has also been so thoroughly constructed to be merged into online digital surveillance capitalism#so reliant on the collection and commodification of personal information for production#if you think using a camera is “automation” you have worms in your brain and you need to see a doctor#if you continue to deny that ai is an apparatus of tech capitalism and is being weaponized against you the consumer you're delusional#the fact that SO many tumblr leftists are ready to defend ai while talking about smashing the surveillance state is baffling to me#and their defense is always “well i don't engage in systems that would make me vulnerable to ai so if you own an apple phone that's on you”#you aren't a communist you're just self-centered
629 notes
Text
The cod-Marxism of personalized pricing

Picks and Shovels is a new, standalone technothriller starring Marty Hench, my two-fisted, hard-fighting, tech-scam-busting forensic accountant. You can pre-order it on my latest Kickstarter, which features a brilliant audiobook read by Wil Wheaton.
The social function of the economics profession is to explain, over and over again, that your boss is actually right, that you don't really want the things you want, and that you're secretly happy to be abused by the system. If that weren't true, why would you "choose" commercial surveillance, abusive workplaces and other depredations?
In other words, economics is the "look what you made me do" stick that capitalism uses to beat us with. We wouldn't spy on you, rip you off or steal your wages if you didn't choose to use the internet, shop with monopolists, or work for a shitty giant company. The technical name for this ideology is "public choice theory":
https://pluralistic.net/2022/06/05/regulatory-capture/
Of all the terrible things that economists say we all secretly love, one of the worst is "price discrimination." This is the idea that different customers get charged different amounts based on the merchant's estimation of their ability to pay. Economists insist that this is "efficient" and makes us all better off. After all, the marginal cost of filling the last empty seat on the plane is negligible, so why not sell that seat for peanuts to a flier who doesn't mind the uncertainty of not knowing whether they'll get a seat at all? That way, the airline gets extra profits, and they split those profits with their customers by lowering prices for everyone. What's not to like?
Plenty, as it turns out. With only four giant airlines who've carved up the country so they rarely compete on most routes, why would an airline use their extra profits to lower prices, rather than, say, increasing their dividends and executive bonuses?
For decades, the airline industry was the standard-bearer for price discrimination. It was basically impossible to know how much a plane ticket would cost before booking it. But even so, airlines were stuck with comparatively crude heuristics to adjust their prices, like raising the price of a ticket that didn't include a Saturday stay, on the assumption that this was a business flyer whose employer was footing the bill:
https://pluralistic.net/2024/06/07/drip-drip-drip/#drip-off
With digitization and mass commercial surveillance, we've gone from pricing based on context (e.g. are you buying your ticket well in advance, or at the last minute?) to pricing based on spying. Digital back-ends allow vendors to ingest massive troves of commercial surveillance data from the unregulated data-broker industry to calculate how desperate you are, and how much money you have. Then, digital front-ends – like websites and apps – allow vendors to adjust prices in realtime based on that data, repricing goods for every buyer.
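To make the mechanism concrete, here's a toy sketch of the kind of per-buyer repricing logic described above. This is purely illustrative, not any real vendor's code: every signal name and multiplier is invented for the example.

```python
# Toy illustration of surveillance-based repricing -- all field names
# and weights here are invented for the example, not drawn from any
# actual vendor's system.

def personalized_price(base_price, profile):
    """Reprice one item for one buyer from broker-style signals."""
    price = base_price
    if profile.get("battery_low"):         # desperation signal
        price *= 1.15
    if profile.get("payday_this_week"):    # ability-to-pay signal
        price *= 1.10
    if profile.get("comparison_shopper"):  # likely to see a rival's price
        price *= 0.95
    return round(price, 2)

# Two buyers, same item, different prices:
print(personalized_price(10.00, {}))                          # baseline
print(personalized_price(10.00, {"battery_low": True,
                                 "payday_this_week": True}))  # marked up
```

The point of the sketch is only this: once the front-end is digital, the price becomes a function of the buyer's data rather than a property of the good.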
As digital front-ends move into the real world (say, with digital e-ink shelf-tags in grocery stores), vendors can use surveillance data to reprice goods for ever-larger groups of customers and types of merchandise. Grocers with e-ink shelf tags reprice their goods thousands of times, every day:
https://pluralistic.net/2024/03/26/glitchbread/#electronic-shelf-tags
Here's where an economist will tell you that actually, your boss is right. Many groceries are perishable, after all, and e-ink shelf tags allow grocers to reprice their goods every minute or two, so yesterday's lettuce can be discounted every fifteen minutes through the day. Some customers will happily accept a lettuce that's a little gross and liztruss if it means a discount. Those customers get a discount, the lettuce isn't thrown out at the end of the day, and everyone wins, right?
Well, sure, if. If the grocer isn't part of a heavily consolidated industry where competition is a distant memory and where grocers routinely collude to fix prices. If the grocer doesn't have to worry about competitors, why would they use e-ink tags to lower prices, rather than to gouge on prices when demand surges, or based on time of day (e.g. making frozen pizzas 10% more expensive from 6-8PM)?
And unfortunately, groceries are one of the most consolidated sectors in the modern world. What's more, grocers keep getting busted for colluding to fix prices and rip off shoppers:
https://www.cbc.ca/news/business/loblaw-bread-price-settlement-1.7274820
Surveillance pricing is especially pernicious when it comes to apps, which allow vendors to reprice goods based not just on commercially available data, but also on data collected by your pocket distraction rectangle, which you carry everywhere, do everything with, and make privy to all your secrets. Worse, since apps are a closed platform, app makers can invoke IP law to criminalize anyone who reverse-engineers them to figure out how they're ripping you off. Removing the encryption from an app is a potential felony punishable by a five-year prison sentence and a $500k fine (an app is just a web-page skinned in enough IP to make it a crime to install a privacy blocker on it):
https://pluralistic.net/2024/08/15/private-law/#thirty-percent-vig
Large vendors love to sell you shit via their apps. With an app, a merchant can undetectably change its prices every few seconds, based on its estimation of your desperation. Uber pioneered this when they tweaked the app to raise the price of a taxi journey for customers whose batteries were almost dead. Today, everyone's getting in on the act. McDonald's has invested in a company called Plexure that pitches merchants on the use case of raising the cost of your normal breakfast burrito by a dollar on the day you get paid:
https://pluralistic.net/2024/06/05/your-price-named/#privacy-first-again
Surveillance pricing isn't just a matter of ripping off customers, it's also a way to rip off workers. Gig work platforms use surveillance pricing to titrate their wage offers based on data they buy from data brokers and scoop up with their apps. Veena Dubal calls this "algorithmic wage discrimination":
https://pluralistic.net/2023/04/12/algorithmic-wage-discrimination/#fishers-of-men
Take nurses: increasingly, American hospitals are firing their waged nurses and replacing them with gig nurses who are booked in via an app. There's plenty of ways that these apps abuse nurses, but the most ghastly is in how they price nurses' wages. These apps buy nurses' financial data from data-brokers so they can offer lower wages to nurses with lots of credit card debt, on the grounds that crushing debt makes nurses desperate enough to accept a lower wage:
https://pluralistic.net/2024/12/18/loose-flapping-ends/#luigi-has-a-point
This week, the excellent Lately podcast has an episode on price discrimination, in which cohost Vass Bednar valiantly tries to give economists their due by presenting the strongest possible case for charging different prices to different customers:
https://www.theglobeandmail.com/podcasts/lately/article-the-end-of-the-fixed-price/
Bednar really tries, but – as she later agrees – this just isn't a very good argument. In fact, the only way charging different prices to different customers – or offering different wages to different workers – makes sense is if you're living in a socialist utopia.
After all, a core tenet of Marxism is "from each according to his ability, to each according to his needs." In a just society, people who need more get more, and people who have less, pay less:
https://en.wikipedia.org/wiki/From_each_according_to_his_ability,_to_each_according_to_his_needs
Price discrimination, then, is a Bizarro-world flavor of cod-Marxism. Rather than having a democratically accountable state that sets wages and prices based on need and ability, price discrimination gives this authority to large firms with pricing power, no regulatory constraints, and unlimited access to surveillance data. You couldn't ask for a neater example of the maxim that "What matters isn't what technology does. What matters is who it does it for; and who it does it to."
Neoclassical economists say that all of this can be taken care of by the self-correcting nature of markets. Just give consumers and workers "perfect information" about all the offers being made for their labor or their business, and things will sort themselves out. In the idealized models of perfectly spherical cows of uniform density moving about on a frictionless surface, this does work out very well:
https://pluralistic.net/2023/04/03/all-models-are-wrong/#some-are-useful
But while large companies can buy the most intimate information imaginable about your life and finances, IP law lets them capture the state and use it to shut down any attempts you make to discover how they operate. When an app called Para offered Doordash workers the ability to preview the total wage offered for a job before they accepted it, Doordash threatened them with eye-watering legal penalties, then threw dozens of full-time engineers at them, changing the app several times per day to shut out Para:
https://pluralistic.net/2021/08/07/hr-4193/#boss-app
And when an Austrian hacker called Mario Zechner built a tool to scrape online grocery store prices – discovering clear evidence of price-fixing conspiracies in the process – he was attacked by the grocery cartel for violating their "IP rights":
https://pluralistic.net/2023/09/17/how-to-think-about-scraping/
This is Wilhoit's Law in action:
Conservatism consists of exactly one proposition, to wit: There must be in-groups whom the law protects but does not bind, alongside out-groups whom the law binds but does not protect.
https://en.wikipedia.org/wiki/Francis_M._Wilhoit#Wilhoit's_law
Of course, there wouldn't be any surveillance pricing without surveillance. When it comes to consumer privacy, America is a no-man's land. The last time Congress passed a new consumer privacy law was in 1988, when they enacted the Video Privacy Protection Act, which bans video-store clerks from revealing which VHS cassettes you take home. Congress has not addressed a single consumer privacy threat since Die Hard was still playing in theaters.
Corporate bullies adore a regulatory vacuum. The sleazy data-broker industry that has festered and thrived in the absence of a modern federal consumer privacy law is absolutely shameless. For example, every time an app shows you an ad, your location is revealed to dozens of data-brokers who pretend to be bidding for the right to show you an ad. They store these location data-points and combine them with other data about you, which they sell to anyone with a credit card, including stalkers, corporate spies, foreign governments, and anyone hoping to reprice their offerings on the basis of your desperation:
https://www.404media.co/candy-crush-tinder-myfitnesspal-see-the-thousands-of-apps-hijacked-to-spy-on-your-location/
Under Biden, the outgoing FTC did incredible work to fill this gap, using its authority under Section 5 of the Federal Trade Commission Act (which outlaws "unfair and deceptive" practices) to plug some of the worst gaps in consumer privacy law:
https://pluralistic.net/2024/07/24/gouging-the-all-seeing-eye/#i-spy
And Biden's CFPB promulgated a rule that basically bans data brokers:
https://pluralistic.net/2024/06/10/getting-things-done/#deliverism
But now the burden of enforcing these rules falls to Trump's FTC, whose new chairman has vowed to end the former FTC's "war on business." What America desperately needs is a new privacy law, one that has a private right of action (so that individuals and activist groups can sue without waiting for a public enforcer to take up their causes) and no "pre-emption" (so that states can pass even stronger privacy laws):
https://www.eff.org/deeplinks/2022/07/federal-preemption-state-privacy-law-hurts-everyone
How will we get that law? Through a coalition. After all, surveillance pricing is just one of the many horrors that Americans have to put up with thanks to America's privacy law gap. The "privacy first" theory goes like this: if you're worried about social media's impact on teens, or women, or old people, you should start by demanding a privacy law. If you're worried about deepfake porn, you should start by demanding a privacy law. If you're worried about algorithmic discrimination in hiring, lending, or housing, you should start by demanding a privacy law. If you're worried about surveillance pricing, you should start by demanding a privacy law. Privacy law won't entirely solve all these problems, but none of them would be nearly as bad if Congress would just get off its ass and catch up with the privacy threats of the 21st century. What's more, the coalition of everyone who's worried about all the harms that arise from commercial surveillance is so large and powerful that we can get Congress to act:
https://pluralistic.net/2023/12/06/privacy-first/#but-not-just-privacy
Economists, meanwhile, will line up to say that this is all unnecessary. After all, you "sold" your privacy when you clicked "I agree" or walked under a sign warning you that facial recognition was in use in this store. The market has figured out what you value privacy at, and it turns out, that value is nothing. Any kind of privacy law is just a paternalistic incursion on your "freedom to contract" and decide to sell your personal information. It is "market distorting."
In other words, your boss is right.
Check out my Kickstarter to pre-order copies of my next novel, Picks and Shovels!
If you'd like an essay-formatted version of this post to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2025/01/11/socialism-for-the-wealthy/#rugged-individualism-for-the-poor
Image: Cryteria (modified) https://commons.wikimedia.org/wiki/File:HAL9000.svg
CC BY 3.0 https://creativecommons.org/licenses/by/3.0/deed.en
--
Ser Amantio di Nicolao (modified) https://commons.wikimedia.org/wiki/File:Safeway_supermarket_interior,_Fairfax_County,_Virginia.jpg
CC BY-SA 3.0 https://creativecommons.org/licenses/by-sa/3.0/deed.en
#pluralistic#personalized pricing#surveillance pricing#ad-tech#realtime bidding#rtb#404media#price discrimination#economics#neoclassical economics#efficiency#predatory pricing#surveillance#privacy#wage theft#algorithmic wage discrimination#veena dubal#privacy first
288 notes
Text
saw this post about a transformers tma au --
aaand the hamster started running and now i have this and also 1 billion more soundwave headcanons
wordless ver under cut
#transformers#soundwave#soundwave transformers#the magnus archives au#tma au#humanformers#...sort of#magnus archives IS the au this is not an au of the magnus archives#sorry jmarters etc i apologize for the intrusion#the eye#soundwave's endless desire for more knowledge is his own downfall#yknow?? the vision is it visioning#he's the fear of surveillance online#of the algorithm that knows you better than yourself#he takes but never gives#information. lives. love? 🤨#anyway his normie human ear got partially blasted off so beholding gave him some Cool New Antennae#draw robots they said robots dont have clothing folds they said
51 notes
Text
One of the more abstract but dire consequences of this streaming mentality is that we’ve started to treat art and culture like wallpaper. The rise of algorithmic curation and AI-generated content has sent this into overdrive: On Spotify, music is detached from its human creators and flattened into algorithmically-generated playlists with hashtag-able labels like “Lo-Fi Chillwave Anime Vibes.” Netflix has even started dictating that producers make TV shows less engaging, so that people can passively consume them as “second screen content” while scrolling on their phones.
In her recent book Mood Machine: The Rise of Spotify and the Cost of the Perfect Playlist, music journalist Liz Pelly refers to this process as “Muzak-ing”—the conversion of media from discrete works of art with a discernible context and author to anonymous background noise meant for passive consumption at the gym or while relaxing at home.
“It turns out that playlists have spawned a new type of music listener, one who thinks less about the artist or album they are seeking out, and instead connects with emotions, moods and activities, where they just pick a playlist and let it roll,” Pelly wrote in an essay for The Baffler. “These algorithmically designed playlists, in other words, have seized on an audience of distracted, perhaps overworked, or anxious listeners whose stress-filled clicks now generate anesthetized, algorithmically designed playlists.”
Digital Packratting is the antithesis of this trend. It requires intentional curation, because you’re limited by the amount of free space on your media server and devices—and the amount of space in your home you’re willing to devote to this crazy endeavor. Every collection becomes deeply personal, and that’s beautiful. It reminds me of when I was in college and everyone in my dorm was sharing their iTunes music libraries on the local network. I discovered so many new artists by opening up that ugly app and simply browsing through my neighbors’ collections. I even made some new friends. Mix CDs were exchanged, and browsing through unfamiliar microgenres felt like falling down a rabbit hole into a new world.
While streaming platforms flatten music-listening into a homogenous assortment of vibes, listening to an album you’ve downloaded on Bandcamp or receiving a mix from a friend feels more like forging a connection with artists and people. As a musician, I’d much rather have people listen to my music this way. Having people download your music for free on Soulseek is still considered a badge of honor in my producer/dj circles.
I don’t expect everyone to read this and immediately go back to hoarding mp3s, nor do I think many people will abandon things like Spotify and Amazon Kindle completely. It’s not like I’m some model citizen either: I share a YouTube Premium account because the ads make me want to die, and I will admit to having a weakness for the Criterion Channel. But the packrat lifestyle has shown me that other ways are possible, and that at the end of the day, the only things we can trust to always be there are the things we can hold in our hands and copy without restriction.
#404 media#subscription services#silicon valley#surveillance capitalism#digital marketplace#streaming#piracy#enshittification#big tech#technology#napster#drm#spotify#algorithm#music#netflix
7 notes
Text

What we are seeing is a fundamental reset that aims to determine not only our relationship with power but the very nature of our existence and even the right to exist as a human.
Read More: https://thefreethoughtproject.com/technology/hegemony-and-propaganda-love-your-servitude
#TheFreeThoughtProject
5 notes
Text
The technological society… will not be a universal concentration camp, for it will be guilty of no atrocity. It will not seem insane, for everything will be ordered, and the stains of human passion will be lost amid the chromium gleam. We shall have nothing more to lose, and nothing to win. Our deepest instincts and our most secret passions will be analyzed, published, and exploited. We shall be rewarded with everything our hearts ever desired. And the supreme luxury of the society of technical necessity will be to grant the bonus of useless revolt and of an acquiescent smile. (p. 427)
This sharp paragraph was not written in the 1990s, nor in the 2010s, nor in recent years. It appeared in French sociologist and philosopher Jacques Ellul’s treatise The Technological Society, originally published in 1954 and translated into English, with Robert Merton’s introduction, in 1964. Ellul was concerned with the emergence of a technological tyranny over humanity and the totality of efficiency, and in this classic book he masterfully elaborated on these substantial issues. Quite a few people claimed 70 years ago, and even decades later, that Ellul was too provocative, that he overstated and exaggerated. Rereading his penetrating observations in the current age evokes very different reactions. What is certain is that we are only at the beginning of this age. So is it possible that we are still far from understanding the full force of Ellul’s warning?
To reflect more on this crucial topic, dive into these insightful books for which Ellul’s The Technological Society laid the groundwork:
— Beer, David. 2019. The Social Power of Algorithms. Routledge.
— Fisher, Eran. 2022. Algorithms and Subjectivity: The Subversion of Critical Knowledge. Routledge.
— Fisher, Max. 2022. The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. Little, Brown and Company.
— Schwarz, Ori. 2021. Sociological Theory for Digital Society: The Codes that Bind Us Together. John Wiley & Sons.
— Zuboff, Shoshana. 2018. The Age of Surveillance Capitalism. Profile Books.
12 notes
Text
Dwelling Beyond Capture – Reclaiming the Texture of Being
In a digital epoch saturated by relentless visibility, the imperative to be seen has become axiomatic. Algorithms sculpt perception, presence is quantified into reach and engagement, and our subjectivities, once sacred, unruly, and intimate, are now datafied for optimisation, circulation, and extraction. What is lost in this conversion is not merely privacy or…

View On WordPress
#Affect Theory#algorithmic culture#anti-consumerism#Attention Economy#attention politics#capitalism critique#commodification of self#contemporary philosophy#critical phenomenology#digital ontology#digital subjectivity#embodied presence#ethics of care#existential autonomy#meditative resistance#neoliberal subjectivity#ontological resistance#performative society#phenomenology of being#platform capitalism#political economy of visibility#post-capitalist ethics#post-digital critique#presence and refusal#Raffaello Palandri#refusal as praxis#selfhood beyond identity#silence and resistance#surveillance capitalism
2 notes
Text
This succinctly articulates a lot of my feelings about The State Of Social Media
#comic#webcomic#smbc#smbc comic#surveillance capitalism#capitalism#algorithms#instagram#twitter#tiktok
3 notes
Text
you guys have got to get better at talking to strangers
I don't even mean like. being more skilled at it. i mean you have to do it. you have to talk to your cashier, to the old woman you pass on the pavement, to the guy waiting in the queue for the gig in front of you, to the teenager deciding between books at the bookstore.
not all the time; you don't have to constantly be talking. but you have to speak to the people around you i am begging you
#our only port of call for communication cannot be online#it's not just the surveillance state but also the echo chamber it creates for you#and the algorithm choosing who you interact with#community building#organizing#us politics
3 notes
Text
Is Your Phone Listening to You? (Probably.) 🤨
Okay, how many times have you talked about something random, like, say, Victorian doorknobs, and then BAM! Your phone's suddenly showing you ads for antique hardware? 🧐 Yeah, it's creepy AF. But is your phone actually listening to you? Short answer: probably.
The Data Minefield
Look, we all know those "free" apps aren't actually free. You're paying with your data, babe. And that data includes, yep, your voice. Companies use this info to target ads, and while they claim they're not recording your every word, it sure feels that way sometimes, doesn't it?
Fight Back Against the Algorithm
So, what can you do? Well, you can ditch your smartphone and live off the grid (tempting, tbh). Or, you can be a little more mindful of the apps you use, check your privacy settings, and maybe, just maybe, have those weird conversations about Victorian doorknobs in person. Just sayin'. 😉
Want to add some personality to your tech while you ponder the mysteries of the digital age? Head over to my shop for some cool designs that'll make your phone (and you) look fly. 😎
#cozidreamsreimagine#graphic design#digital art#artistsoninstagram#creative entrepreneurship#etsyshop#shop small#small business#millennials#gen z#life hacks#funny#memes#relatable#humor#mindfulness#mental health#digitalwellbeing#onlinesafety#internet culture#social media#algorithms#big tech#surveillance#data#technology#data privacy
2 notes
Text
The Philosophy of Social Media
The philosophy of social media examines the profound impact of social media platforms on human interaction, identity, and society. This interdisciplinary field intersects with ethics, epistemology, sociology, and media studies, exploring how digital technologies shape our communication, perceptions, and behaviors. By analyzing the philosophical implications of social media, we gain insights into the nature of digital life and its influence on contemporary society.
Key Themes in the Philosophy of Social Media
Digital Identity and Self-Presentation:
Social media allows users to construct and curate their online personas, raising questions about authenticity, self-expression, and the nature of identity.
Philosophers explore how the digital environment influences self-perception and the distinction between online and offline selves.
Epistemology and Information:
The spread of information and misinformation on social media platforms presents challenges to traditional epistemology.
Discussions focus on the credibility of sources, the role of algorithms in shaping information, and the impact of echo chambers on knowledge and belief formation.
Ethics of Communication and Behavior:
The ethical implications of online behavior, including issues of privacy, cyberbullying, and digital harassment, are central to this field.
Philosophers examine the moral responsibilities of individuals and platforms in fostering respectful and ethical online interactions.
Social Media and Society:
Social media's role in shaping public discourse, political engagement, and social movements is a significant area of inquiry.
The influence of social media on democracy, public opinion, and collective action is critically analyzed.
Privacy and Surveillance:
The balance between privacy and surveillance on social media platforms raises important ethical and philosophical questions.
The implications of data collection, user tracking, and digital surveillance on personal freedom and autonomy are explored.
The Nature of Virtual Communities:
Social media creates new forms of community and social interaction, prompting philosophical inquiries into the nature and value of virtual communities.
The concepts of digital solidarity, community building, and the social dynamics of online interactions are examined.
Aesthetics of Social Media:
The visual and aesthetic dimensions of social media, including the impact of images, videos, and memes, are considered.
Philosophers analyze how aesthetic choices and digital art forms influence perception and communication in the digital age.
Addiction and Mental Health:
The psychological effects of social media use, including addiction, anxiety, and the impact on mental health, are significant areas of study.
Philosophers explore the ethical considerations of designing platforms that may contribute to addictive behaviors.
Algorithmic Bias and Justice:
The role of algorithms in shaping social media experiences raises questions about bias, fairness, and justice.
Philosophers critically assess the implications of algorithmic decision-making and its impact on social equality and discrimination.
Commercialization and Consumerism:
The commercialization of social media platforms and the commodification of user data are key concerns.
Discussions focus on the ethical implications of targeted advertising, consumer manipulation, and the economic dynamics of social media companies.
The philosophy of social media provides a comprehensive framework for understanding the complexities of digital interaction and its impact on contemporary life. By examining issues of identity, epistemology, ethics, and societal influence, this field offers valuable insights into the ways social media shapes our world. It encourages a critical and reflective approach to digital life, emphasizing the need for ethical considerations and responsible use of technology.
#philosophy#epistemology#knowledge#learning#chatgpt#education#Digital Identity#Social Media Ethics#Online Behavior#Epistemology of Social Media#Privacy and Surveillance#Virtual Communities#Aesthetics of Social Media#Mental Health and Social Media#Algorithmic Justice#Commercialization of Social Media#social media
4 notes
Text
AI and National Security: The New Battlefield
New Post has been published on https://thedigitalinsider.com/ai-and-national-security-the-new-battlefield/
Artificial intelligence is changing how nations protect themselves. It has become essential for cybersecurity, weapon development, border control, and even public discourse. While it offers significant strategic benefits, it also introduces many risks. This article examines how AI is reshaping security, the current outcomes, and the challenging questions these new technologies raise.
Cybersecurity: A Fight of AI against AI
Most present‑day attacks start in cyberspace. Criminals no longer write every phishing email by hand. They use language models to draft messages that sound friendly and natural. In 2024, a gang used a deep-fake video of a chief financial officer to steal 25 million dollars from his own firm: the video looked so real that an employee followed the fake transfer order without question. Attackers now feed large language models leaked resumes or LinkedIn data to craft personalized bait. Some groups even use generative AI to hunt for software bugs or write malware snippets.
Defenders are also using AI to shield against these attacks. Security teams feed network logs, user clicks, and global threat reports into AI tools. The software learns “normal” activity and warns when something suspicious happens. When an intrusion is detected, AI systems can automatically isolate the suspect machine, limiting damage that would spread if humans reacted more slowly.
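The article doesn't name a specific detection method, but the "learn normal, flag deviations" loop it describes can be sketched as a per-user baseline with a z-score threshold. This is a hypothetical minimal model; real systems use far richer features than hourly event counts.

```python
import statistics

def build_baseline(history):
    """Learn each user's typical hourly event count from past logs.

    history: dict mapping user -> list of hourly event counts.
    Returns dict mapping user -> (mean, population std dev).
    """
    return {user: (statistics.mean(counts), statistics.pstdev(counts))
            for user, counts in history.items()}

def flag_anomalies(baseline, current, threshold=3.0):
    """Return (user, z-score) pairs whose current count is far above normal."""
    alerts = []
    for user, count in current.items():
        mean, std = baseline.get(user, (0.0, 0.0))
        std = std or 1.0  # guard against flat histories
        z = (count - mean) / std
        if z > threshold:
            alerts.append((user, round(z, 1)))
    return alerts
```

In practice the "response" step the article mentions (isolating a machine) would hang off each alert; the value of the model is that the threshold is per-user, so a burst that is normal for one account still trips the alarm for another.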
Autonomous Weapons
AI also steps onto physical battlefields. In Ukraine, drones use onboard vision to find fuel trucks or radar sites and dive into them. The U.S. has used AI to help identify targets for airstrikes in places like Syria. Israel’s army recently used an AI target‑selection platform to sort thousands of aerial images and mark potential militant hideouts. China, Russia, Turkey, and the U.K. have tested “loitering munitions” that circle an area until AI spots a target. These technologies can make military operations more precise and reduce risks for soldiers. But they also bring serious concerns. Who is responsible when an algorithm chooses the wrong target? Some experts fear “flash wars” in which machines react too quickly for diplomats to intervene. Many are calling for international rules on autonomous weapons, but states fear falling behind if they pause.
Surveillance and Intelligence
Intelligence services once relied on teams of analysts to read reports or watch video feeds. Today they rely on AI to sift millions of images and messages each hour. In some countries, like China, AI tracks citizens’ behavior, from small things like jaywalking to what they do online. Similarly, on the U.S.–Mexico border, solar towers with cameras and thermal sensors scan empty desert. The AI spots a moving figure, labels it human or animal, then alerts patrol agents. This “virtual wall” covers wide ground that humans could never watch alone.
While these tools extend coverage, they also magnify errors. Face‑recognition systems have been shown to misidentify women and people with darker skin at higher rates than white men. A single false match can subject an innocent person to extra checks or detention. Policymakers are calling for audited algorithms, clear appeal paths, and human review before any consequential action.
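The audit policymakers ask for can begin with something as simple as comparing false-match rates across demographic groups on a labeled evaluation set. A hypothetical sketch; the tuple format and group labels are assumptions, not any real benchmark:

```python
def false_match_rates(results):
    """Per-group false-match rate on an evaluation set.

    results: iterable of (group, predicted_match, actual_match) tuples.
    Only genuine non-matches can produce a false match, so the rate is
    false positives / non-match trials within each group.
    """
    totals, errors = {}, {}
    for group, predicted, actual in results:
        if actual:
            continue  # skip genuine matches
        totals[group] = totals.get(group, 0) + 1
        if predicted:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / n for g, n in totals.items()}
```

A large gap between groups in this table is exactly the disparity the article describes: the system is not equally wrong for everyone.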
Information Warfare
Modern conflicts are fought not only with missiles and code but also with narratives. In March 2024 a fake video showed Ukraine’s president ordering soldiers to surrender; it spread online before fact‑checkers debunked it. During the 2023 Israel–Hamas fighting, AI‑generated fakes favoring one side or the other flooded social feeds in an effort to tilt opinion.
False information spreads faster than governments can correct it. This is especially problematic during elections, when AI-generated content is used to sway voters who struggle to distinguish real images and videos from synthetic ones. Governments and tech firms are building counter‑AI projects that scan for the digital fingerprints of generated media, but the race is tight: creators improve their fakes just as fast as defenders improve their filters.
Decision Support
Armies and agencies collect vast amounts of data including hours of drone video, maintenance logs, satellite imagery, and open‑source reports. AI helps by sorting and highlighting relevant information. NATO recently adopted a system inspired by the U.S. Project Maven. It links databases from 30 member states, providing planners with a unified view. The system suggests likely enemy movements and identifies potential supply shortages. The U.S. Special Operations Command uses AI to help draft parts of its annual budget by scanning invoices and recommending reallocations. Similar AI platforms predict engine failures, schedule repairs in advance, and customize flight simulations for individual pilots’ needs.
Law Enforcement and Border Control
Police forces and immigration officers are using AI for tasks that require constant attention. At busy airports, biometric kiosks confirm travelers’ identities to speed up processing. Pattern-analysis software picks out travel records that hint at human trafficking or drug smuggling. In 2024, one European partnership used such tools to uncover a ring moving migrants through cargo ships. These tools can make borders safer and help catch criminals, but there are concerns too. Facial recognition still performs worse on demographic groups underrepresented in its training data, which can lead to mistaken stops. Privacy is another issue: the key question is whether AI should be used to monitor everyone so closely.
The Bottom Line
AI is changing national security in many ways, offering both opportunities and risks. It can protect countries from cyber threats, make military operations more precise, and improve decision-making. But it can also spread lies, invade privacy, or make deadly errors. As AI becomes more common in security, we need to find a balance between using its power for good and controlling its dangers. This means countries must work together and set clear rules for how AI can be used. In the end, AI is a tool, and how we use it will redefine the future of security. We must be careful to use it wisely, so it helps us more than it harms us.
#2023#2024#agents#ai#AI Cyber Security#AI in defense strategies#AI in national security#AI platforms#ai surveillance#AI systems#ai tools#ai-generated content#alerts#algorithm#Algorithms#Analysis#Article#artificial#Artificial Intelligence#attackers#attention#autonomous#autonomous weapons#Behavior#biometric#border#borders#bugs#Cameras#China
0 notes
Text
The Line They Keep Crossing
There comes a point in any civilization when a great reckoning emerges—not from the fringes of society, but from the steady erosion of its institutions. In the United States, we are dangerously close to that point now. Google and its many digital tentacles are not just indexing the world’s information—they are editing its memory, curating its rage, and deciding which truths get daylight. And…
#algorithmic bias#corporate overreach#democratic erosion#digital authoritarianism#digital rights#Freedom of Speech#google censorship#legal complicity#media control#moral accountability#political manipulation#pre-revolution warning#shadow banning#surveillance capitalism#tech monopoly
0 notes
Text
Predictive Policing: AI-Driven Crime Forecasting and Its Ethical Dilemmas
Discover how AI-driven crime forecasting in predictive policing is reshaping law enforcement, and explore its ethical dilemmas, including bias and privacy concerns. Predictive policing refers to the use of AI algorithms to forecast potential criminal activity from historical data. While this approach aims to enhance law-enforcement efficiency, it introduces significant ethical concerns. What Is…
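At its simplest, "forecasting from historical data" means ranking map cells by past incident counts. A deliberately minimal sketch (the cell IDs and record format are hypothetical) that also shows where the bias criticism bites:

```python
from collections import Counter

def forecast_hotspots(incidents, k=3):
    """Rank map grid cells by historical incident count, return the top k.

    incidents: list of (cell_id, ...) records from past crime reports.
    This naive frequency model is where the bias problem lives: if the
    historical data reflects where police patrolled rather than where
    crime occurred, the forecast sends patrols back to the same
    neighbourhoods, generating more records for those cells in turn.
    """
    counts = Counter(record[0] for record in incidents)
    return [cell for cell, _ in counts.most_common(k)]
```

Production systems layer regression or machine-learned models on top of this, but the feedback loop sketched in the comment is the same, which is why audits of the input data matter as much as audits of the model.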
#AI Crime Forecasting#Algorithmic Bias#Artificial Intelligence in Law Enforcement#Ethical AI#Ethics and Privacy#Future of Policing#Law Enforcement Innovation#Policing Technology#Predictive Policing#Surveillance and Civil Rights#UK Crime Forecasting#USA Predictive Policing
0 notes
Text
Welcome to "The War You Can't See"
Is Traditional War Dead — or Just Rebranded?
Are we past the era of bombs, boots, and drone strikes? Maybe. But not because we’ve become more peaceful — just more efficient.
Why nuke a city when you can collapse its power grid and watch the lights, water, hospitals, commerce — and society — all flicker offline? Remember when a single botched security-software update brought millions of Windows machines around the world to a standstill? That wasn’t warfare. That was a preview.
Modern conflict isn’t fought on battlefields. It’s fought in code, supply chains, infrastructure, and perception.
And what about information?
Weaponised misinformation is no longer a Cold War tactic — it’s business-as-usual. Traditional media is saturated with bias, funded by the highest bidder, dressed in headlines that are actually ads. In Australia, entire news segments blur the line between reporting and native advertising. Journalism has become a product — and we’re the ones being sold.
Then there’s the algorithm: our supposed gateway to truth. But whose truth? Curated by whom? Funded by what? When your feed is built by unseen commercial interests, your worldview becomes a subscription model.
Now enter AI — systems trained on the data of our broken information ecosystem.
Can we really expect neutrality from machines built on human bias? Even the purest dataset is tainted by its source — culture, ideology, power.
Should we be able to audit what trains AI? Yes. Will we? That depends on whether transparency serves those in control.
Because we’re not in the Information Age anymore — that’s old tech. We’ve entered the AI Age — a time when digital systems no longer just store knowledge, but shape it, interpret it, and replace our need to think critically at all.
So what comes next?
Does AI fade like 3D TVs and holograms — hyped, then shelved? Or does it become our bridge to something post-terrestrial — guiding a civilisation too damaged to stay on Earth, but too stubborn to end here?
The War You Can’t See
We didn’t end war. We just changed the weapons.
There was a time when war meant bombs and boots. Now? It’s code, chaos, and collapse — executed from behind keyboards, masked as policy or progress.
Modern conflict isn’t about shock and awe. It’s about confusion and collapse.
The Quiet Wars of the 21st Century
Why drop a bomb when you can:
Crash a national power grid remotely
Infect the population with misinformation until truth becomes meaningless
Manipulate markets, supply chains, food systems
Split a society until it implodes on itself
This is asymmetric warfare — and it’s not coming. It’s here.
One botched software update can cripple global logistics. One viral deepfake can swing elections. Now imagine that… weaponised, at scale, and on purpose.
No uniforms. No rules of engagement. Just quiet devastation.
Misinformation Is the New Munition
In an attention economy, truth is no longer sacred. It’s a subscription.
What used to be journalism is now:
Corporate content disguised as objectivity
Algorithms that reward outrage, not accuracy
Engagement loops that harden echo chambers
AI doesn’t fix this. It just speeds it up.
Models are trained on human data. Human data is biased. So the system reflects the power structures that shaped it: colonial, capitalist, western, dominant.
Truth becomes a product. And you're the consumer.
The War of Influence
This is the battlefield now:
Infrastructure attacks
Socioeconomic sabotage
Data manipulation
Manufactured division
The soldier is a coder. The battleground is your belief system. The casualty? Consensus reality.
This is 21st-century warfare. You won’t hear the bullets. But you’ll feel the collapse.
Final Thought: We’re All Combatants Now
The age of visible enemies and declared wars is over. What we face now is persistent, ambient conflict — war without warning, borders, or uniforms. The lines between citizen and soldier, truth and tactic, reality and narrative have blurred.
And the most dangerous part? Most people don’t even realise they’re under attack.
This isn’t peace. It’s just war with better PR.
#modern warfare#digital warfare#asymmetric warfare#psychological operations#information war#media manipulation#cyberwarfare#truth decay#socioeconomic warfare#soft power#global instability#algorithmic bias#collapse of consensus#post truth era#propaganda machine#data as a weapon#future conflict#invisible war#surveillance state#digital cynicism
0 notes