You mean deaths to multiple other people, do you not? Let's just call a spade a spade here and point out the genuine ethical dilemma.
What trade ratio between "bodies of your own kids" and "human bodies you have no other connection with" should a "proper" AI controlling a car YOU purchased be willing to make in terms of injury or death?
I think most people would argue that it's greater than 1* (unless you are a pure rationalist, in which case, I tip my hat to you), but what "SHOULD" it be?
*meaning, in the case of a ratio of 2 for example, you would require 2 nonfamiliar deaths to justify losing one of your own kids
I would suggest that all but the most narcissistic would have some limit to how many pedestrians they would be willing to run over to save their own lives. The demand that the AI have no such limit—“that the AI will prioritize my life and safety over literally any other concern”—is grotesque.
I mean deaths the AI predicts for other people, yes
And I'm not saying I would never choose to kill myself over killing a schoolbus full of children, but I'll be damned if a computer will make that choice for me.
You can't get into a trolley situation without driving unsafely for the conditions first, so companies focus on preventing that earlier issue.
Isn’t this entirely hypothetical? In reality, are any systems doing this calculus? Or are they mimicking humans, avoiding obstacles and shedding kinetic energy in a series of rapid-fire calls?
There's plenty we could talk about, e.g. the failure scenarios of shallow reasoning systems, the serious limitations on the resolution and capability of the actual Tesla cameras used for navigation, the failure modes of LIDAR, etc.
Instead we got "what if the car calculates the trolley problem against you?"
And observationally, it's proof that a staggering number of people don't know their road rules, since every variant of it consists of concocting some scenario where the brakes are slammed far too late, yet you somehow know perfectly well there's not a preschool class behind the nearest brick wall or something.
I remember running some basic numbers on this in an argument, and you basically wind up at: assuming an AI is fast enough to detect the situation at all, either it's fast enough that it can always stop the car with the brakes, or no amount of aggressive manoeuvring would avoid the collision anyway.
Which is of course exactly what the road rules prescribe: you slam on the brakes. Every other option is worse, and gets worse still when an AI that's smart enough to even consider other options can also brake sooner and harder.
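The "detect early enough and braking suffices" argument can be sketched with back-of-envelope kinematics. All the numbers below (system latency, deceleration, speeds) are illustrative assumptions, not measured AV specs:

```python
# Hedged sketch: can the vehicle brake to a full stop before an obstacle,
# given detection distance, speed, system latency, and braking deceleration?
# All parameter values are illustrative assumptions.

def can_stop(obstacle_m: float, speed_ms: float,
             latency_s: float = 0.2, decel_ms2: float = 8.0) -> bool:
    """True if a full stop is possible before reaching the obstacle."""
    reaction_dist = speed_ms * latency_s             # distance covered before braking starts
    braking_dist = speed_ms ** 2 / (2 * decel_ms2)   # v^2 / (2a), constant deceleration
    return obstacle_m > reaction_dist + braking_dist

# At 15 m/s (~34 mph) the stop needs ~17 m in total.
print(can_stop(25, 15))  # True  -- detected early enough, braking suffices
print(can_stop(10, 15))  # False -- detected too late; no manoeuvre has much room either
```

The point the comment makes falls out of the geometry: the same detection distance that makes a swerve thinkable usually already makes a straight-line stop possible, without the loss-of-control risk.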
Yeah, there are a shocking number of accidents which basically amount to "they tried to swerve and it went badly".
You can concoct a few scenarios where other drivers are violating the road rules so much as to basically be trying to murder you -- the simplest example is "you are stopped at a light and a giant truck is barreling towards you too fast to stop".
If you are a normal driver, you probably learn about this when you wake up in the hospital, but an autonomous vehicle could be watching how fast vehicles are approaching from behind you. There's going to be a wide range of scenarios where it will be clear the truck is not going to stop but there's still time to do something (for instance, a truck going 65 mph takes around 5 seconds to stop, and because it covers the first half of its stopping distance quickly, even halfway through that distance you'd still have roughly 3.5 seconds to maneuver out of the way).
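The timing in that example is easy to check under the usual constant-deceleration assumption (the 5-second total stopping time is taken from the scenario above, not from any truck spec):

```python
import math

# Back-of-envelope check: time remaining once a truck is halfway
# through its stopping distance, assuming constant deceleration.
v = 65 * 0.44704          # 65 mph in m/s (~29.1 m/s)
total_t = 5.0             # assumed total stopping time from the scenario
a = v / total_t           # implied deceleration (~5.8 m/s^2)

speed_at_half = v / math.sqrt(2)      # from v^2 = v0^2 - 2*a*(d/2)
time_remaining = speed_at_half / a    # time to stop from that speed

print(round(time_remaining, 2))  # 3.54 -- more than half the time remains
```

Because speed is highest early on, the first half of the stopping distance is covered in only ~29% of the total time, so at the halfway point roughly 70% of the stopping time is still on the clock.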
That does leave you all sorts of room to come up with realistic trolley problems.
But all of them require a human (or malicious) driver on the other side. The more rule-following AVs on the road, the fewer the opportunities for such trolley problems.
And I'd still argue that debating these ex ante is, while philosophically fascinating, not a practical discussion. I'm not seeing a case where one would code anything further than collision avoidance and e.g. pre-activating restraints.
The typical human preference WRT the trolley problem ("don't take an action that leads to deaths, even if it would save more lives") is also a reasonable answer -- maybe the only reasonable one -- to these hypotheticals.
I.e., move against the light to avoid getting rear-ended, but not if doing so would run over a pedestrian or cause an accident with another vehicle. (Even if getting rear-ended would push you into the pedestrian or the other car.)
What is the lowest likelihood of your own death you'd find acceptable in this situation?
This is a fair concern. I’m unconvinced it’s even remotely a real market or political pressure.
On the market side, Waymo is constrained by some combination of production and auxiliaries. (Tesla, by technology.) On the political side, the salient debate is around jobs, in large part because Waymo has put to bed many of the practical safety questions from a best-in-class perspective.
I'm not really thinking about when self driving is State of the Art Research. I'm talking about when it becomes table stakes.
Honestly the real truth is I just do not trust tech companies to make decisions that are remotely in my best interest anymore.
I can't even trust tech companies to build software that respects a "do not send me marketing emails" checkbox, why would I ever trust a car driven by software built by the same sort of asshole?
Idk, we solve it then. Motor vehicles kill 40,000 Americans a year [1]. I’m willing to cautiously align with Google and maybe even Tesla if they can take a bite out of those numbers.
"Prioritizing my life over every other concern" looks like plowing over pedestrians to get me to the hospital. I don't think you can legally sell a product that promises that.
As for me, I actually like driving and I'm good at it. I'm not afraid of operating my own vehicle like so many people seem to be.
Replacing other bad drivers with good autonomous systems is likely a great trade-off for you, even if you are in an autonomous vehicle that is willing to sacrifice you in an unavoidable incident.
You just said that you do not care how many people you kill - regardless of whether they are pedestrians, whether they are driving cars or whether they are on the bus. That is what people react to.