#epistemic status: a little contrarian but also serious
My most significant reaction to this essay is to think that “create a haven for outcasts and a paradise for bohemians, with lots of warm connections of mutual support and fun between people who don’t fit in with broader society” is a much better goal than “save the world.”
“Create a haven for outcasts and a paradise for bohemians, with lots of warm connections of mutual support and fun between people who don’t fit in with broader society” is a goal that a small number of people with moderate access to resources can actually achieve. You just need to find these people and convince them to associate with each other and support each other, and the infrastructure for that is pretty cheap in the internet age, and the rationalist subculture already has the infrastructure for that. As a form of altruism, I think it’s pretty cost-effective. A person’s whole life can often be turned in a better direction with as little as a few thousand dollars. A single moderately affluent person can make an enormous difference to the lives of five or six or a dozen other people this way. I guess it’s not as efficient as mosquito net charities or something like that, but I think it’s probably much better than, say, contributing money to Friendly AI research (I suspect that by the time we actually build general AI any Friendly AI research we cook up now will be at best historically significant philosophical texts that introduced some important general concepts but are hopelessly outdated when they discuss the meat and potatoes of programming, at worst the equivalent of Medieval physicians writing elaborate treatises about the subtleties of the four humors theory of medicine).
I think “create a mutual support community for eccentric nerds” is also likely to be a very satisfying form of altruism (which, among other things, means less risk of burn-out). You’re helping “your people,” and every day you’ll interact with people whose lives have been obviously and tangibly bettered by your actions. Your altruism will be part of a network of mutual support and “thick” social connections between people who like each other and feel personal loyalty to each other, which is a very time-tested and successful formula (it’s a strategy that has a lot of problems too, sure, but it has some actually very nice features).
By contrast, “save the world” is... probably beyond the power of a small group of people with moderate resources, even if that’s a group that trends more affluent and smarter than average. I mean, taken at face value, “save the world” is a problem that has absorbed a significant fraction of humanity’s total intelligence and resources for literally centuries. Big concrete steps toward “saving the world” (such as, say, inventing commercially viable fusion reactors) would probably require an effort comparable to the Manhattan Project or bigger in many cases. Even “modest” steps in the “save the world” direction (such as, say, a coronavirus vaccine or a new hydroelectric dam) tend to be quite skill-intensive and technology-intensive and/or resource-intensive.
Compared to the status quo of 1800 C.E. the world has been saved, but it was mostly done by a strategy that looked very different from the high rationalist “get a small number of smart people together with the explicit goal of saving the world” strategy. The actual historically successful world-saving strategy was a huge number of people working to solve a huge number of more specific problems that were only tangentially related to saving the world. “Get a relatively small number of smart enlightened people with the right mindset together and set them to work with the explicit goal of saving the world” reminds me more of science fiction, specifically of Isaac Asimov’s concept of the Foundation, and I honestly wonder if that’s mostly where the idea came from (I observe that “member of the Foundation” is an aspirational social role that’s very ego-pleasing and seductive to a book-smart “gifted” eccentric who doesn’t fit in well with mainstream society; y’know, the kind of person likely to be attracted to science fiction and rationalist subculture).
For a small number of people with moderate access to resources, “Create a haven for outcasts and a paradise for bohemians, with lots of warm connections of mutual support and fun between people who don’t fit in with broader society” is an achievable goal. For a small number of people with moderate access to resources, “save the world” will at best inspire a lot of useful efforts that contribute to that project, but at worst it’ll be like twenty people trying to level a mountain with hand tools: a lot of effort expended accomplishing very little (I think the big danger for such a group is getting seduced into pouring their efforts into a glamorous but probably futile moon-shot; something like “instead of this hard grinding work of actually saving the world, maybe we can just build a friendly super-AI that will do it for us!”).
Also, I think “save the world” is a community recruitment pitch that’s likely to disproportionately attract people who are scrupulous-in-the-badbrains-sense, fanatical, narcissistic, or some combination of those things. Frank Herbert once made this observation about politics:
“All governments suffer a recurring problem: Power attracts pathological personalities. It is not that power corrupts but that it is magnetic to the corruptible.” - Frank Herbert, Chapterhouse: Dune.
I think Frank Herbert identified a real recurring problem with managerial institutions here. Politicians and bosses are high on my list of people I’d be a little wary of dating or being friends with, because I strongly suspect that becoming a politician or a boss really does select for narcissism and authoritarian tendencies. I think you almost have to be a little narcissistic or have significant authoritarian tendencies to look around and think “the world would be better if other people were forced to obey me!” I think this effect contributes to many of the institutional and social pathologies of governments, businesses, and other hierarchical organizations. Likewise, I think you almost have to be at least a little narcissistic to hear the high rationalist recruitment pitch of “we’re assembling a team of smart enlightened people who are exceptionally suited to saving the world, so we can save the world!” and think “I’m a smart enlightened person who’s exceptionally suited to saving the world, I belong on that team!” Please don’t feel too personally attacked if that sounds like an uncharitable description of you; being a little narcissistic is a common personality trait, and part of the reason for that is that you probably really are better than average at the things you like and are good at … but I’d be wary of joining a community with a recruitment pitch that selects for narcissism and fanaticism, and I’d be wary of making such a recruitment pitch for a community I belong to.
I will note that, as an unquantified System 1 impression, I feel there’s a very cloudy but noticeable correlation between how close a piece of writing is to high rationalism and how much it’s infused with the things I find most off-putting about the rationalist subculture: elitist contempt for ordinary people, a very self-important vision of the rationalist subculture’s role, cult-of-smartness attitudes, a mix of vaguely Randian, Nietzschean, and Protestant-work-ethic attitudes, and libertarian-adjacent politics that implicitly reflects the biases and class interests of affluent tech-company workers.