Then, on the way home, it drove on the wrong side of the street and I had to take over. Such a silly mistake.
Similar to what you said: from then on it was more trouble than it was worth, because you can't let your guard down.
FWIW: My 2026 Hyundai's driver assistance is better than my old 2018 Tesla Model 3's enhanced autopilot.
I find it less cognitive load to drive myself. It's easier to predict what other vehicles will do than what my own will do. Boo.
I have sympathy for the challenges, as I worked in the field.
It always had the feeling of being outside with your toddler by the pool: I can look away, but I have 50/50 odds of a dead toddler if I do it for too long.
Their role is to stop the train in an emergency and to adjust speed etc. to track/driving conditions.
Automating their job probably wouldn't even need the complex ML used for self-driving, because the context is significantly simpler and relatively well defined. Maybe a tram in a city might need such a model, but it would still be a significantly simpler task than driving a car.
Triggering psychosis is not difficult, and the LLM is easily capable of doing that. With a person, they soon get freaked out and are likely to summon help: "Johnny started acting crazy and I'm not sure what to do, please come." But the LLM isn't a person. Johnny needs to know more about the CIA's programme to cross-breed Venusians with Hollywood stars? Here's an itinerary with the address of a real hotel in LA and an entirely hallucinated CIA officer's schedule.
Next thing you know, Johnny is shot dead by officers responding to a maniac with a fire axe who broke into an LA hotel and was screaming about space aliens.
I’m pretty sure LAPD is too used to this sort of thing to get spooked by it?
I do not want to be the 'manager of my car'. That'd be a downgrade from being an actual driver.
Lane assist, auto stop-start, and cruise control are enough for me; they've mostly been available for decades and require a similar amount of attention.
FSD is a busted flush and I can't believe those who got conned by it aren't more vocal.
If you're driving, your brain can automatically prioritize the importance of things that you see. But since a computer fails in different ways than a human, you lose all that automatic prioritization.
One such study is "Performance consequences of automation-induced 'complacency'" (Parasuraman, Molloy & Singh, 1993) https://www.pacdeff.com/pdfs/Automation%20Induced%20Complace...
Previous studies had found that a human and a computer performed markedly better than either a human alone or a computer alone - but in those studies failures were quite common, so they didn't give the humans time to get bored or distracted.
When researchers got test subjects to perform a simulated flying task, monitoring a system with 99%+ reliability, they found the humans were proportionally much worse at stepping in than they were on less reliable systems.
Swimming pool lifeguards will often change posts every 15-20 minutes and get a 10-15 minute break every hour, to keep things interesting enough that they can pay attention. Good luck getting drivers to do that.
Funny, I was going to mention exactly that. I'm a private pilot with a modern autopilot and flying is exhausting. Partly because the piston engine is rattling your brain the entire time but also because you're on high alert the entire time. You're always making sure the autopilot is keeping the plane on the blue (or green) line and is being predictable. And my smartwatch shows my heart rate is usually more elevated on autopilot than not.
A "self-driving" Tesla is an adversary you need to supervise to make sure it doesn't take actions you wouldn't expect of a normal car.
As other posters have pointed out, it's like running an LLM with `--dangerously-skip-permissions`: I wouldn't `rm -rf /` my computer (or in the case of the Tesla, my life), but an AI might.