I’ve been thinking about the Yudkowsky doomsday cult recently. It reminds me of the Heaven’s Gate cult. Yud is the AI version of Marshall Applewhite.
Applewhite’s thing was this: Earth is about to be “spaded under” by aliens that exist on “the evolutionary level above human.” When the mothership comes, there will be an opportunity to join them at the higher evolutionary level, where you will be provided with a new physical body. Applewhite was himself a higher being and was put on earth to show the path to ascension.
(In his case, the way you ascend is through mass suicide at the right time, in the right place.)
Let me translate this into the Yud version:
Humanity is about to go extinct from AI. The AI will be superintelligent (“the evolutionary level above human”), and there’s a limited window of time to avoid the catastrophe. Yudkowsky himself is at a higher level of being (not superintelligent, but one of the smartest humans ever) and is providing us a path to salvation. If we don’t follow his path, there’s a 99.9% chance the AI will exterminate us all.
At the present moment, it looks like his path to salvation runs through the government: there needs to be international control placed on GPUs and computation, under threat of violence. Right now, he’s merely calling for global totalitarianism, but I would not be surprised at all if this transitions into a suicide cult.
He apparently wrote a long article, deliberately posted on April 1st, about transitioning to “dying with dignity” in the face of failure to align the AI properly.
At the end of the article, he includes a Q&A with himself that appears to explain his motivation:
Q: Hey, this was posted on April 1st. All of this is just an April Fool's joke, right?
A: Why, of course! Or rather, it's a preview of what might be needful to say later, if matters really do get that desperate. You don't want to drop that on people suddenly and with no warning.
Given his other public statements about impending death, I don’t think this was a satirical article. I think he’s intentionally saying: if humanity doesn’t follow my path, we will all be spaded under by the AI/aliens. Providing us with this information on April 1st is a kind gesture, to gently expose us to a terrible truth that humanity isn’t prepared for yet.
I haven’t read enough of his content to know whether he’s a transhumanist, but I know a lot of people in his circle are, which perfectly mirrors Applewhite’s idea of receiving a new physical body from the higher beings. “Receive a new body from the alien gods” is the same as “merge your body with the machine gods.”
I’m sure others have beaten me to this conclusion, but I think there’s a real chance that Yudkowsky will organize some kind of group suicide. In fact, I would say the odds are significantly higher than the odds of the AI exterminating us all. I’d even be willing to place a small bet, just for fun. $5 says that he’ll be calling for some kind of mass suicide before the end of this decade.
(And in case anybody needs to read this: if the time ever comes, do not “die with dignity” with the great Yudkowsky.)
(From Twitter)
Universal politeness makes the world worse.
Most people have a difficult time judging ideas, so they end up going along with whatever social framework the ideas are presented within. Which ideas they take seriously is a social phenomenon, not an intellectual one. So whatever the "smart people" seem to respect gains immediate respect. The framework determines the discussion.
The problem arises when terrible ideas, presented by bad thinkers (or disingenuous ones), are treated with undeserved respect. It’s like going along with a lie because you don’t want to be rude. This happens all the time outside the world of ideas, and it’s why we get "respected" martial artists who are incompetent, teaching other people to be incompetent.
In the worst cases, it's how you end up with priests abusing people for decades. Nobody wants to speak up, be rude, and risk rocking the boat.
I see it right now with the Rationalist AI Exterminationists. All the signs of a cult are there, including an impending apocalypse and polycules.
Polite engagement with the ideas of the cult leader is not what's needed. In fact, if this is actually a cult, such behavior makes the world worse. What's needed is somebody saying "Wake up, dummies. You're trapped in a cult. The signs are there."
The correct response to Applewhite is not, "But what if the Earth *is* about to be spaded under by aliens?! We only get one shot at this..."
It's, "This guy has signs of mental illness and looks to be building a cult." If that's rude, then I'll be the rude one, and you can thank me in a few years.
I just saw you say this in a tweet:
"Polite engagement with the ideas of the cult leader is not what's needed. In fact, if this is actually a cult, such behavior makes the world worse. What's needed is somebody saying 'Wake up, dummies. You're trapped in a cult. The signs are there.'"
I guess that answers my question about whether you plan to refute the rationale behind their predictions. To me this sounds anti-intellectual. It sounds a lot like what book burners would say, or what progressive university students say to justify banning speakers from campus. They also think that even engaging with certain ideas would make the world worse.