
410 points by jjulius | 1 comment | source
AlchemistCamp ◴[] No.41889077[source]
The interesting question is how good self-driving has to be before people tolerate it.

It's clear that having half the casualty rate per distance traveled of the median human driver isn't acceptable. How about a quarter? Or a tenth? Accidents caused by human drivers are one of the largest causes of injury and death, but they're not newsworthy the way an accident involving automated driving is. It's all too easy to see a potential future where many people die needlessly because technology that could save lives is regulated into a greatly reduced role.

replies(20): >>41889114 #>>41889120 #>>41889122 #>>41889128 #>>41889176 #>>41889205 #>>41889210 #>>41889249 #>>41889307 #>>41889331 #>>41889686 #>>41889898 #>>41890057 #>>41890101 #>>41890451 #>>41893035 #>>41894281 #>>41894476 #>>41895039 #>>41900280 #
Arainach ◴[] No.41889128[source]
This is about lying to the public and stoking false expectations for years.

If it's "fully self driving" Tesla should be liable for when its vehicles kill people. If it's not fully self driving and Tesla keeps using that name in all its marketing, regardless of any fine print, then Tesla should be liable for people acting as though their cars could FULLY self drive and be sued accordingly.

You don't get to lie just because you're allegedly safer than a human.

replies(4): >>41889149 #>>41889881 #>>41890885 #>>41893587 #
SoftTalker ◴[] No.41889881[source]
It’s your car, so ultimately the liability is yours. That’s why you have insurance. If Tesla retains ownership, and just lets you drive it, then they have (more) liability.
replies(1): >>41894328 #
kelnos ◴[] No.41894328[source]
> It’s your car, so ultimately the liability is yours

No, that's not how it works. The driver and the driver's insurer are on the hook when something bad happens. The owner is not, except when the owner is also the driver, or when the owner has been negligent with maintenance and the crash was caused by a mechanical failure related to that negligence.

If someone else is driving my car and I'm a passenger, and they hurt someone with it, the driver is liable, not me. If that "someone else" is a piece of software, and that piece of software has been licensed/certified/whatever to drive a car, why should I be liable for its failures? That piece of software needs to be insured, certainly. It doesn't matter if I'm required to insure it, or if the manufacturer is required to insure it.

Tesla FSD doesn't fit into this scenario because it's not the driver. You are still the driver when you engage FSD, because despite its name, FSD is not capable of filling that role.

replies(1): >>41903503 #
SoftTalker ◴[] No.41903503[source]
Incorrect. Or at least, it varies by state. I was visiting my mother, borrowed her car, and had a minor accident with it. Her insurance paid, not mine.

This is why you are required to have insurance for the cars you own. You may from time to time be driving cars you do not own, and the owners of those cars are required to have insurance for those cars, not you.