#and this is just. the most cliche and predictable and non-radical thing it is possible for me to do because it's THIS character
musical-chick-13 · 1 year
Most of the time I am SO LOUDLY ENTHUSIASTIC about being bisexual (at least, like--in my brain or the places where it’s actually safe to be Out™), but occasionally, very occasionally, like once every other year, the Internalized Biphobia™ just decides to show up and then I feel guilty for being Attracted to a Man, when I am. Literally. Bisexual. With a history. Of being attracted to. People of. All genders.
3 notes
mugasofer · 3 years
It seems like many, perhaps most, people historically believed in some imminent apocalypse.
Many philosophies claim that the world is passing into a degenerate age of chaos (Ages of Man, Kali Yuga, life-cycles of civilisation), or that divine conflict will shortly spill over & destroy the Earth (Ragnarök, Revelation, Zoroastrian Frashokereti), or that the natural forces sustaining us must be transient.
Yet few panic or do much of anything. What people do "do about it" is often symbolic & self-admittedly unlikely to accomplish much.
Maybe humans evolved not to care, to avoid being manipulated?
Many cults make similar claims, and do uproot their lives around them - even, on rare occasions, committing mass suicide or terror attacks. But cults exist that don't make such claims, so apocalypticism may not be the mechanism they use to control members, or at most a minor one. "This is about the fate of the whole world, nothing can be more important than that, so shut up" may work as a thought-terminating cliché, but it doesn't seem to work that strongly, and there are many at least equally effective ones.
Some large-scale organisations do exist that seem to take their eschatology "seriously". The Aztecs committed atrocities trying to hold off the apocalypse; ISIS tried to cause it. Arguably some Communist or even fascist groups count, depending on your definition of apocalypse.
But even then, one can argue their actions were not radically different from those of non-apocalypse-motivated groups - e.g. the Aztecs executed fewer people per capita than the UK did at times, & some historians view their sacrifices as more about displaying authority.
I'm thinking about this because of two secular eschatologies - climate apocalypse and the Singularity.
My view on climate change, which as far as I can tell is the scientific consensus, is that it is real and bad but by no means apocalyptic. We're talking incremental increases in storms, droughts, floods etc, all of which are terrible, but none of which remotely threaten human civilisation. E.g. according to the first Google result, the sea is set to rise by 1 decimeter by 2100 in a "high emissions scenario", not to rise by tens or hundreds of meters and consume all coastal nations as I was taught as a child. Some more drastic projections suggest that the sea might rise by as much as two or three meters in the worst case scenario.
It really creeps me out when I hear people who profess to believe that human civilisation, the human species, or even all life on Earth is most likely going to be destroyed soon by climate change. The most recent example, which prompted this post, was the Call of Cthulhu podcast I was listening to casually suggesting that it might be a good idea to summon an Elder God of ice and snow to combat climate change as the "lesser existential risk", perhaps by sacrificing "climate skeptics" to it. It's incredibly jarring for me to realise that the guys I've been listening to casually chatting about RPGs think they live in a world that will shortly be ended by the greed of its rulers. But this idea is everywhere. Discussions of existential risks from e.g. pandemics inevitably attract people arguing that the real existential risk is climate change. A major anti-global-warming protest movement, Extinction Rebellion, is literally named after the idea that they're fighting against their own extinction. Viral Tumblr posts talk about how the fear of knowing that the world is probably going to be destroyed soon by climate change and fascism is crippling their mental health, and they have no idea how to deal with it because it's all so real.
But it's not. It's not real.
Well, I can't claim that political science is accurate enough for me to definitively say that fascism isn't going to take over, but I can say that climate science is fairly accurate and it predicts that the world is definitely not about to end in fire or in flood.
(There are valid arguments that climate change or other environmental issues might precipitate wars, which could turn apocalyptic due to nuclear weapons; or that we might potentially encounter a black swan event due to our poor understanding of the ecosystem and climate-feedback systems. But these are very different, as they're self-admittedly "just" small risks to the world.)
And I get the impression that a lot of people with more realistic views about climate change deliberately pander to this, deliberately encouraging people to believe that they're going to die because it puts them on the "right side of the issue". The MCU's Loki, for instance, recently casually brought up a "climate apocalypse" in 2050, which many viewers took to mean the end of the world. Technically, the show uses a broad definition of "apocalypse" - Pompeii is given as another example - and it kind of seems like maybe all they meant was natural disasters encouraged by climate change, which is totally defensible. But I still felt kind of mad about it, that they're deliberately pandering to an idea which they hopefully know is false and which is causing incredible anxiety in people. I remember when Greta Thunberg was a big deal, I read through her speeches to Extinction Rebellion, and if you parsed them closely it seemed like she actually did have a somewhat realistic understanding of what climate change is. But she would never come out and say it; it was all vague implications of doom, which she was happily giving to a rally called "Extinction Rebellion" filled with speakers who were explicitly stating, not just coyly implying, that this was a fight for humanity's survival against all the great powers of the world.
But maybe there's nothing wrong with that. I despise lying, but as I've been rambling about, this is a very common lie that most people somehow seem unaffected by. Maybe the viral Tumblr posts are wrong about the source of their anxiety; maybe it's internal/neurochemical, and they would just have picked some other topic to project their anxieties onto if this particular apocalypse weren't available. Maybe this isn't a particularly harmful lie, and it's hypocritical of me to be shocked by those who believe it.
Incidentally, I believe the world is probably going to end within the next fifty years.
Intellectually, I find the arguments that superhuman AI will destroy the world pretty undeniable. Sure, forecasting the path of future technology is inherently unreliable. But the existence of human brains, some of which are quite smart, proves pretty conclusively that it's possible to get lumps of matter to think - and human brains are designed to run on the tiny amounts of energy they can scavenge from plants and the occasional scrap of meat in the wilderness, with chemical signals that propagate at around the speed of sound (much slower than electronic ones), with only the data they can get from the input devices they carry around with them, and they break down irrevocably after a few decades. And while we cannot necessarily extrapolate from the history of progress in computer hardware and AI, that progress is incredibly impressive, and there's no particular reason to believe it will fortuitously stop right before we manufacture enough rope to hang ourselves.
Right now, at time of writing, we have neural nets that can write basic code, appear to scale linearly in effectiveness with the available hardware with no signs that we're reaching their limit, and have not yet been applied at the current limits of available hardware, let alone what will be available in a few years. They absorb information like a sponge at a vastly superhuman speed and scale, allowing them to be trained in days or hours rather than the years or decades humans require. They are already human-level or massively superhuman at many tasks, and are capable of many things I would have confidently told you a few years ago were probably impossible without human-level intelligence, like the crazy shit AI Dungeon is capable of. People are actively working on scaling them up so that they can work on and improve the sort of code they are made from. And we have no ability to tell what they're thinking, or to control them without a ton of trial and error.
If you follow this blog, you're probably familiar with all the above arguments for why we're probably very close to getting clobbered by superhuman AI, and many more, as well as all the standard counter-arguments and the counter-arguments to those counter arguments.
(Note: I do take some comfort in God, but even if my faith were so rock solid that I would cheerfully bet the world on it - which it's not - there's no real reason why our purpose in God's plan couldn't be to destroy ourselves or be destroyed as an object lesson to some other, more important civilization. There's ample precedent.)
Here's the thing: I'm not doing anything about it, unless you count occasionally, casually talking about it with people online. I'm not even donating to help any of the terrifyingly few people who are trying to do something about it. Part of why I'm not contributing is, frankly, that I don't have a clue what to do, nor do I have much confidence in any of the stuff people are currently doing (although I bloody well hope some of it works).
And yet I don't actually feel that scared.
I feel more of a visceral chill reading about the nuclear close calls that almost destroyed the world in the recent past than thinking about the stuff that has a serious chance of doing so in a few decades. I'm a neurotic mess, and yet what is objectively the most terrifying thing on my radar does not actually seem to contribute to my neurosis.
21 notes