People saw Black Mirror and made a business plan out of it.

https://en.wikipedia.org/wiki/Metalhead_(Black_Mirror)

reply
Also this short film, Slaughterbots, from 2017: https://youtu.be/O-2tpwW0kmU?is=F7RNLXcVuLA5A_lA
reply
It’s been a part of sci-fi for a long time.
reply
It's going to happen, and at some level I'd rather war casualties were measured in robots than in people.

My concern is the cottage industry of integrating guns with half-baked AI at the lowest cost. And probably vibe-coded too.

The companies don't care - a sale is a sale. The MoD maybe doesn't care - 90% accuracy and fewer human casualties on their own side are a win. Governments want to save money, and by the time they find out the robots have gone rogue, it will be too late to do anything about it.

reply
I can't wait for the day that killing a human, any human, is considered a war crime.
reply
The problem is always the same. It's not just MoD (is it MoW now?) that will have access to this.

YOLOv8 + optical flow works fine on an ESP32. You want to give a drone rough coordinates for a refinery and hit something in it, like a storage tank? That'll work. Which means that, give it 5 years, relatively small groups will have access to it. This cannot be stopped.
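
To be concrete about how simple the optical-flow piece is: here's a toy block-matching sketch in pure Python on synthetic frames (an illustration only - real pipelines use hardware-accelerated dense flow like Lucas-Kanade or Farnebäck, not exhaustive search):

```python
# Toy block-matching motion estimation: find how far a patch moved
# between two frames by exhaustive SAD (sum of absolute differences)
# search. Pure-Python illustration of the principle, not production code.

def sad(a, b):
    """Sum of absolute differences between two equal-size patches."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def patch(frame, y, x, h, w):
    """Extract an h*w patch with top-left corner (y, x)."""
    return [row[x:x + w] for row in frame[y:y + h]]

def block_flow(prev, curr, y, x, h, w, search=3):
    """Estimate (dy, dx) motion of the h*w block at (y, x) in `prev`
    by exhaustive SAD search within +/- `search` pixels in `curr`."""
    ref = patch(prev, y, x, h, w)
    best, best_dyx = float("inf"), (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            if ny < 0 or nx < 0 or ny + h > len(curr) or nx + w > len(curr[0]):
                continue
            cost = sad(ref, patch(curr, ny, nx, h, w))
            if cost < best:
                best, best_dyx = cost, (dy, dx)
    return best_dyx

# Synthetic 12x12 grayscale frames: a bright 3x3 blob moves down 1, right 2.
W = H = 12
prev = [[0] * W for _ in range(H)]
curr = [[0] * W for _ in range(H)]
for yy in range(3):
    for xx in range(3):
        prev[4 + yy][4 + xx] = 255
        curr[5 + yy][6 + xx] = 255

print(block_flow(prev, curr, 4, 4, 3, 3))  # -> (1, 2)
```

That's the whole idea: track where a region went between frames. Everything hard (speed, robustness, lighting) is engineering on top of it, which is exactly why the barrier to entry is so low.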

The only real answer is to work to have groups that you can trust to have access to this first.

reply
Sadly, building an AI that analyses camera imagery and aims at humans, from scratch, is these days almost an intern project. It's not really something you can control or ban, the way you can control, dunno, uranium enrichment.

Integrating it with a robot and sticking a gun on it, thankfully, requires a bit more know-how.

reply
> The only real answer is to work to have groups that you can trust to have access to this first.

How will this help exactly?

reply
Friendly fire is going to get crazy. Can’t trust an LLM on its own for more than a few iterations.
reply
Don’t worry, it will auto compact its context.
reply
I can't wait for the Faro Plague and the robot dinosaurs.
reply