
410 points by jjulius | 1 comment
AlchemistCamp ◴[] No.41889077[source]
The interesting question is how good self-driving has to be before people tolerate it.

It's clear that having half the casualty rate per distance traveled of the median human driver isn't acceptable. How about a quarter? Or a tenth? Accidents caused by human drivers are one of the largest causes of injury and death, but they're not newsworthy the way an accident involving automated driving is. It's all too easy to see a potential future where many people die needlessly because technology that could save lives is regulated into a greatly reduced role.

replies(20): >>41889114 #>>41889120 #>>41889122 #>>41889128 #>>41889176 #>>41889205 #>>41889210 #>>41889249 #>>41889307 #>>41889331 #>>41889686 #>>41889898 #>>41890057 #>>41890101 #>>41890451 #>>41893035 #>>41894281 #>>41894476 #>>41895039 #>>41900280 #
Arainach ◴[] No.41889128[source]
This is about lying to the public and stoking false expectations for years.

If it's "fully self driving" Tesla should be liable for when its vehicles kill people. If it's not fully self driving and Tesla keeps using that name in all its marketing, regardless of any fine print, then Tesla should be liable for people acting as though their cars could FULLY self drive and be sued accordingly.

You don't get to lie just because you're allegedly safer than a human.

replies(4): >>41889149 #>>41889881 #>>41890885 #>>41893587 #
1. awongh ◴[] No.41893587[source]
Also force other automakers to be liable when their over-tall SUVs cause more deaths than sedan-type cars.