It's because few realize how downstream most of this AI industry is of Peter Thiel, Eliezer Yudkowsky, and LessWrong.com.
Early "rationalist" community was concerned with AI in this way 20 years ago. Eliezer inspired and introduced the founders of Google DeepMind to Peter Thiel to get their funding. Altman acknowledged how influential Eliezer was by saying how he is most deserving of a Nobel Peace prize when AGI goes well (by lesswrong / "rationalist" discussion prompting OpenAI). Anthropic was a more X-risk concerned fork of OpenAI. Paul Christiano inventor of RLHF was big lesswrong member. AI 2027 is an ex-OpenAI lesswrong contributor and Scott Alexander, a centerpiece of lesswrong / "rationalism". Dario, Anthropic CEO, sister is married to Holden Karnofsky, a centerpiece of effective altruism, itself a branch of lesswrong / "rationalism". The origin of all this was directionally correct, but there was enough power, $, and "it's inevitable" to temporarily blind smart people for long enough.
I really recommend “More Everything Forever” by Adam Becker. The book does a really good job laying out the arguments for AI doom, EA, accelerationism, and affiliated movements, including an interview with Yudkowsky, then debunking them. But it really opened my eyes to how… bizarre? eccentric? unbelievable? this whole industry is. I’ve been in tech for over a decade but don’t live in the bay, and some of the stuff these people believe, or at least say they believe, is truly nuts. I don’t know how else to describe it.
Yeah, it's a pretty blatant cult masquerading as a consensus - but they're all singing from the same hymn sheet in lieu of any actual evidence to support their claims. A lot of it is heavily quasi-religious and falls apart under examination from external perspectives.
We're gonna die, but it's not going to be AI that does it: it'll be the oceans boiling and C3 carbon fixation flatlining that do it.
> Anthropic was a more X-risk concerned fork of OpenAI.
What is X-risk? From the "X" I would have guessed it meant adult, as in X-rated, but that doesn't sound right.
It is very weird to wonder: what if they're all wrong? Sam Bankman-Fried was clearly just as committed to these ideas, and he crashed his company into the ground.
But clearly, if someone said something like this out of context:
"Clearly, the most obvious effect will be to greatly increase economic growth. The pace of advances in scientific research, biomedical innovation, manufacturing, supply chains, the efficiency of the financial system, and much more are almost guaranteed to lead to a much faster rate of economic growth. In Machines of Loving Grace, I suggest that a 10–20% sustained annual GDP growth rate may be possible."
I'd say that they were a snake oil salesman. All of my life experience says that there's no good reason to believe Dario's predictions here, but I'm taken in just as much as everyone else.