I love this.

Another way of saying it: the problem we should be focused on is not how smart the AI is getting. The problem we should be focused on is how dumb people are getting (or have been for all of eternity) and how they will facilitate or block their own chances of survival.

That seems uniquely human, but I'm not an ethnobiologist.

A corollary to that is that the only real chance for survival is for a plurality of humans to have a baseline understanding of these threats; otherwise the dumb majority will enable the eradication of humanity entirely.

Seems like a variation of Darwin's law, though I always thought of that as applying to individual cases. This applies it to the entirety of humanity.

reply
> The problem we should be focused on is how dumb people are getting (or have been for all of eternity)

Over the arc of time, I'm not sure it's accurate to characterize humans as getting dumber and dumber. If that were true, we must have been super geniuses 3000 years ago!

I think what is true is that the human condition and its age-old questions are still with us, and we're still on the path to figuring out ourselves and the cosmos.

reply
Totally anecdotal, but I think phones have made us less present, or, said another way, less capable of using our brains effectively. It isn't exactly dumb, but it feels very close.

I definitely think we are smarter if you go by IQ, but are we less reactive and less tribal? I'm not so sure.

reply
Modern dumb people have more ability to affect things. Modern technology, equal rights, and voting rights give them access to more control than they've ever had.

That's my theory, anyway.

reply
The majority of us are meme-copying automatons who are easily pwned by LLMs. Few of us have learned to exercise critical thinking and to reason from first assumptions - the kind of thing we are expected to learn in school, and also the kind of thing that still separates us from machines. A charitable view is that there is a spectrum here. Now, with AI and social media, there will be an acceleration of the movement toward the stupid end of that spectrum.
reply
> That seems uniquely human, but I'm not an ethnobiologist.

In my opinion, this is a uniquely human thing because we're smart enough to develop technologies with planet-level impact, but not smart enough to use them well. Other animals are less intelligent, but for that very reason they lack the ability to harm themselves on the same scale as we can.

reply
Isn't defining what no one should be allowed to do exactly the problem that laws (as in legislation) are for? Not that I expect those laws to arrive in time.
reply
Look, we've had nukes for almost 100 years now. Do you really think our ancient alien zookeepers are gonna let us wipe ourselves out with AI? Semi /j
reply
It's even worse than that.

The positive outcomes are being structurally closed off. The race to the bottom means you can't even profit from it.

Even if you release something that has plenty of positive aspects, it can be, and is, immediately corrupted and turned against you.

At the same time, you have created desperate people and companies, given them huge capabilities at very low cost, and left them with no choice but to stir things up.

So for every good door that someone opens, ten other companies or people are pushed to either open random, potentially bad doors or die.

Regulation is also out of the question, because either people who don't respect the regulations get ahead, or the regulators win and we end up under their control.

If you still see some positive doors, I don't think sharing them would lead to good outcomes. But at the same time, the bad doors are being shared and therefore enjoy network effects. There is some silent threshold, probably already crossed, past which the sign of the technology's expected return flips.

reply