I mean, that does partially reduce the chances of a cartel, but not nearly as much as you might think.

Most countries have pretty strong bans on most kinds of weapons; the US is one of the few that lets everyone run around with their rooty tooty point and shooty. Some bans exist because the government doesn't want the people to have weapons, and in other cases the citizens themselves call for them because they don't like the idea of getting shot by their fellow citizens.

It won't be long before citizens and governments get tired of models being used for criminal activity and eventually lay down laws around this. Models will have to be registered and safety tested, with strict criminal prosecution if you don't comply. And the big model companies will back their favorite politicians to make sure this happens, too.

Now, that will be helpful in general, since there will still be plenty of models around, but it still won't be a free-for-all.

reply
deleted
reply
Uhhh, you know that the paperclip problem stems from an AI just following a task without understanding what it is doing? Not from it being misaligned.

I would go out on a limb and say that current AI could create a paperclip problem, given powerful enough tools.

reply
The argument is that it's misaligned because it values only one thing, more paperclips, while human values are much more varied and complex.

It's debatable whether it truly understands what it's doing, but the argument usually assumes it does, at least in the sense that it can imagine outcomes and create plans to reach its singular goal. That makes it a very simple toy example of a misaligned system.

reply
Well, part of the problem, too, is that there's zero accountability. Who decides what it means to be aligned, and how does that evolve over time?

No matter what, common people are quickly losing agency in that discussion.

reply