I don’t think that’s quite right. I’d say instead that if the singularity does happen, there’s no telling which beliefs will have mattered.
reply
If people believe it's a threat and it is also real, then what matters is timing.
reply
Which would also mean the accelerationists are potentially putting everyone at risk. I'd think a soft takeoff decades in the future would give us a much better chance of building the necessary safeguards and reorganizing society accordingly.
reply
This is a soft takeoff.

We, the people actually building it, have been discussing it for decades.

I started reading Kurzweil in the early 90s.

If you're not up to speed, that's your fault.

reply
Decades from now. Society is nowhere near ready for a singularity. The AI we have now, as far as it has come, is still a tool for humans to use. It's more Augmented Intelligence than AGI.

A hard takeoff would be the tool bootstrapping itself into an autonomous self-improving ASI in a short amount of time.

And I read Kurzweil years ago too. He thought that reverse-engineering the human brain, once the hardware was powerful enough, would give us the singularity by 2045, and that the Turing Test would be passed by 2029 — though it seems LLMs have already accomplished that.

reply
Depends on what a post-singularity world looks like, with Roko's basilisk and everything.
reply
> If the singularity does happen, then it hardly matters what people do or don't believe.

Depends on how you feel about Roko's basilisk.

reply
God, Roko's basilisk is the most boring AI risk to catch the public consciousness. It's just Pascal's wager all over again, with the exact same rebuttal: you can posit an equal and opposite basilisk, so the expected payoffs cancel out.
reply
The culture that brought you "speedrunning computer science with JavaScript" and "speedrunning exploitative, extractive capitalism" is back with its new banger, "speedrunning philosophy". Nuke it from orbit; save humanity.
reply