410 points jjulius | 21 comments
    AlchemistCamp ◴[] No.41889077[source]
    The interesting question is how good self-driving has to be before people tolerate it.

    It's clear that having half the casualty rate per distance traveled of the median human driver isn't acceptable. How about a quarter? Or a tenth? Accidents caused by human drivers are one of the largest causes of injury and death, but they're not newsworthy the way an accident involving automated driving is. It's all too easy to see a potential future where many people die needlessly because technology that could save lives is regulated into a greatly reduced role.

    replies(20): >>41889114 #>>41889120 #>>41889122 #>>41889128 #>>41889176 #>>41889205 #>>41889210 #>>41889249 #>>41889307 #>>41889331 #>>41889686 #>>41889898 #>>41890057 #>>41890101 #>>41890451 #>>41893035 #>>41894281 #>>41894476 #>>41895039 #>>41900280 #
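The argument above is a straightforward expected-value one. As a back-of-envelope sketch (the baseline fatality rate of ~1.2 deaths per 100M vehicle miles and the ~3.2 trillion annual US vehicle miles are my assumed round numbers, not figures from the thread):

```python
# Back-of-envelope sketch: expected annual US road deaths if every mile
# were driven at some multiple of the human casualty rate.
# Both baseline figures below are rough assumptions for illustration.

BASELINE_DEATHS_PER_100M_MILES = 1.2   # assumed human-driver fatality rate
ANNUAL_MILES = 3.2e12                  # assumed annual US vehicle miles

def expected_deaths(rate_multiplier: float) -> float:
    """Expected annual deaths if all driving ran at this multiple of the human rate."""
    rate = BASELINE_DEATHS_PER_100M_MILES * rate_multiplier
    return rate * ANNUAL_MILES / 100e6

for label, m in [("human (1x)", 1.0), ("half", 0.5), ("quarter", 0.25), ("tenth", 0.1)]:
    print(f"{label:>10}: ~{expected_deaths(m):,.0f} deaths/year")
```

Under those assumptions, even the "unacceptable" half-rate system would avoid roughly 19,000 deaths a year, which is the commenter's point about regulation having a body count of its own.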
    1. Arainach ◴[] No.41889128[source]
    This is about lying to the public and stoking false expectations for years.

If it's "fully self driving," Tesla should be liable when its vehicles kill people. If it's not fully self driving, yet Tesla keeps using that name in all its marketing, regardless of any fine print, then Tesla should be liable, and sued accordingly, when people act as though their cars really could FULLY self drive.

    You don't get to lie just because you're allegedly safer than a human.

    replies(4): >>41889149 #>>41889881 #>>41890885 #>>41893587 #
    2. jeremyjh ◴[] No.41889149[source]
    I think this is the answer: the company takes on full liability. If a Tesla is Fully Self Driving then Tesla is driving it. The insurance market will ensure that dodgy software/hardware developers exit the industry.
    replies(4): >>41889184 #>>41890181 #>>41890189 #>>41890241 #
    3. blagie ◴[] No.41889184[source]
    This is very much what I would like to see.

    The price of insurance is baked into the price of a car. If the car is as safe as I am, I pay the same price in the end. If it's safer, I pay less.

    From my perspective:

    1) I would *much* rather have Honda kill someone than myself. If I killed someone, the psychological impact on myself would be horrible. In the city I live in, I dread ageing; as my reflexes get slower, I'm more and more likely to kill someone.

    2) As a pedestrian, most of the risk seems to come from outliers -- people who drive hyper-aggressively. Replacing all cars with a median driver would make me much safer (and traffic, much more predictable).

    If we want safer cars, we can simply raise insurance payouts, and vice-versa. The market works everything else out.

Either way, my stress levels go way down, whether in a car, on a bike, or on foot.

    replies(1): >>41889228 #
    4. gambiting ◴[] No.41889228{3}[source]
    >> I would much rather have Honda kill someone than myself. If I killed someone, the psychological impact on myself would be horrible.

Except that we know it doesn't work like that. Train drivers are racked with guilt every time "their" train runs over someone, even though they know, logically, that there was absolutely nothing they could have done to prevent it. I don't see why it would be any different here.

    >>If we want safer cars, we can simply raise insurance payouts, and vice-versa

In what way? In the EU the minimum covered amount for any car insurance is 5 million euros, and it has had no impact on the safety of cars. Meanwhile, the recent increase in payouts (due to the general rise in labour and parts costs) has led to a dramatic increase in insurance premiums, which in turn has led to a drastic increase in the number of people driving without insurance. That now requires more policing and enforcement, which we pay for through taxes. So no, the market doesn't "work everything out".

    replies(2): >>41890554 #>>41894294 #
    5. SoftTalker ◴[] No.41889881[source]
    It’s your car, so ultimately the liability is yours. That’s why you have insurance. If Tesla retains ownership, and just lets you drive it, then they have (more) liability.
    replies(1): >>41894328 #
    6. tensor ◴[] No.41890181[source]
    I’m for this as long as the company also takes on liability for human errors they could prevent. I’d want to see cars enforcing speed limits and similar things. Humans are too dangerous to drive.
    7. stormfather ◴[] No.41890189[source]
That would be good, because it would incentivize all FSD cars to communicate with each other. Imagine how safe driving would be if they were all broadcasting their speed and position to each other, with each vehicle that sends/receives getting cheaper insurance.
    replies(2): >>41890733 #>>41899496 #
    8. KoolKat23 ◴[] No.41890241[source]
That's just reducing the value of a life to a number. It can be gamed into a situation where it's simply more profitable to mow people down.

What's an acceptable number/financial cost is also just an indirect, approximated way of implementing more direct, scientific regulation. Not everything needs to be reduced to money.

    replies(1): >>41890691 #
    9. blagie ◴[] No.41890554{4}[source]
> Except that we know it doesn't work like that. Train drivers are racked with guilt every time "their" train runs over someone, even though they know, logically, that there was absolutely nothing they could have done to prevent it. I don't see why it would be any different here.

    It's not binary. Someone dying -- even with no involvement -- can be traumatic. I've been in a position where I could have taken actions to prevent someone from being harmed. Rationally not my fault, but in retrospect, I can describe the exact set of steps needed to prevent it. I feel guilty about it, even though I know rationally it's not my fault (there's no way I could have known ahead of time).

    However, it's a manageable guilt. I don't think it would be if I knew rationally that it was my fault.

    > So no, market doesn't "work everything out".

    Whether or not a market works things out depends on issues like transparency and information. Parties will offload costs wherever possible. In the model you gave, there is no direct cost to a car maker making less safe cars or vice-versa. It assumes the car buyer will even look at insurance premiums, and a whole chain of events beyond that.

    That's different if it's the same party making cars, paying money, and doing so at scale.

    If Tesla pays for everyone damaged in any accident a Tesla car has, then Tesla has a very, very strong incentive to make safe cars to whatever optimum is set by the damages. Scales are big enough -- millions of cars and billions of dollars -- where Tesla can afford to hire actuaries and a team of analysts to make sure they're at the optimum.

    As an individual car buyer, I have no chance of doing that.

    Ergo, in one case, the market will work it out. In the other, it won't.
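The manufacturer-as-insurer argument above amounts to a simple optimization: pick the safety spend per car that minimizes safety cost plus expected payout. A minimal sketch, with entirely invented numbers (crash probability, payout size, and the assumption that each $1k of safety spend halves crash risk are all hypothetical):

```python
# Sketch of the optimization a liable manufacturer faces: minimize
# (safety spend per car) + (expected liability payout per car).
# All numbers are invented for illustration.

def total_cost_per_car(safety_spend: float,
                       base_crash_prob: float = 0.002,
                       payout: float = 5_000_000.0) -> float:
    """Safety spend plus expected payout; assume each $1k of spend halves crash risk."""
    crash_prob = base_crash_prob * 0.5 ** (safety_spend / 1000.0)
    return safety_spend + crash_prob * payout

# Sweep candidate spend levels ($0 to $20k in $500 steps) for the cheapest total.
best = min(range(0, 20001, 500), key=total_cost_per_car)
```

The point of the sketch is that this calculation is easy at fleet scale with actuaries on payroll, and practically impossible for an individual buyer comparing sticker prices.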

    10. jeremyjh ◴[] No.41890691{3}[source]
    There is no way to game it successfully; if your insurance costs are much higher than your competitors you will lose in the long run. That doesn’t mean there can’t be other penalties when there is gross negligence.
    replies(1): >>41891750 #
    11. Terr_ ◴[] No.41890733{3}[source]
It gets kind of dystopian if access to the network becomes a monopolistic barrier.
    replies(1): >>41896451 #
    12. mrpippy ◴[] No.41890885[source]
    Tesla officially renamed it to “Full Self Driving (supervised)” a few months ago, previously it was “Full Self Driving (beta)”

    Both names are ridiculous, for different reasons. Nothing called a “beta” should be tested on public roads without a trained employee supervising it (i.e. being paid to pay attention). And of course it was not “full”, it always required supervision.

    And “Full Self Driving (supervised)” is an absurd oxymoron. Given the deaths and crashes that we’ve already seen, I’m skeptical of the entire concept of a system that works 98% of the time, but also needs to be closely supervised for the 2% of the time when it tries to kill you or others (with no alerts).

It’s an abdication of duty that NHTSA has let this continue for so long. They’ve picked up the pace recently, and I wouldn’t be surprised if they come down hard on Tesla (unless Trump wins, in which case Elon will be put in charge of NHTSA, the SEC, and the FAA).

    replies(1): >>41892629 #
    13. KoolKat23 ◴[] No.41891750{4}[source]
Who said management and shareholders are in it for the long run? There are plenty of examples of businesses run purely for the short term: bonuses and stock pumps.
    14. ilyagr ◴[] No.41892629[source]
I hope they soon rename it to "Fully Supervised Driving".
    15. awongh ◴[] No.41893587[source]
Also force other automakers to be liable when their over-tall SUVs cause more deaths than sedan-type cars do.
    16. kelnos ◴[] No.41894294{4}[source]
    Being in a vehicle that collides with someone and kills them is going to be traumatic regardless of whether or not you're driving.

    But it's almost certainly going to be more traumatic and more guilt-inducing if you are driving.

    If I only had two choices, I would much rather my car kill someone than I kill someone with my car. I'm gonna feel bad about it either way, but one is much worse than the other.

    17. kelnos ◴[] No.41894328[source]
    > It’s your car, so ultimately the liability is yours

    No, that's not how it works. The driver and the driver's insurer are on the hook when something bad happens. The owner is not, except when the owner is also the one driving, or if the owner has been negligent with maintenance, and the crash was caused by mechanical failure related to that negligence.

    If someone else is driving my car and I'm a passenger, and they hurt someone with it, the driver is liable, not me. If that "someone else" is a piece of software, and that piece of software has been licensed/certified/whatever to drive a car, why should I be liable for its failures? That piece of software needs to be insured, certainly. It doesn't matter if I'm required to insure it, or if the manufacturer is required to insure it.

    Tesla FSD doesn't fit into this scenario because it's not the driver. You are still the driver when you engage FSD, because despite its name, FSD is not capable of filling that role.

    replies(1): >>41903503 #
    18. tmtvl ◴[] No.41896451{4}[source]
    Not to mention the possibility of requiring pedestrians and cyclists to also be connected to the same network. Anyone with access to the automotive network could track any pedestrian who passes by the vicinity of a road.
    replies(1): >>41899155 #
    19. Terr_ ◴[] No.41899155{5}[source]
    It's hard to think of a good blend of traffic safety, privacy guarantees, and resistance to bad-actors. Having/avoiding persistent identification is certainly a factor.

    Perhaps one approach would be to declare that automated systems are responsible for determining the position/speed of everything around them using regular sensors, but may elect to take hints from anonymous "notice me" marks or beacons.
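That "notice me" beacon idea could plausibly avoid persistent tracking by rotating its identifier. A hypothetical sketch (the beacon class, the 5-second rotation interval, and the ID format are all invented, not an existing protocol):

```python
# Hypothetical "anonymous notice-me beacon": broadcast a fresh random ID
# every few seconds, so nearby cars can fuse it with their sensor tracks
# but nobody can follow one persistent identifier across the network.

import secrets
import time

ROTATION_SECONDS = 5  # assumed rotation interval

class AnonymousBeacon:
    def __init__(self) -> None:
        self._id = secrets.token_hex(8)       # 16-hex-char random ID
        self._issued = time.monotonic()

    def current_id(self) -> str:
        """Return the beacon ID, minting a new one once the interval elapses."""
        if time.monotonic() - self._issued >= ROTATION_SECONDS:
            self._id = secrets.token_hex(8)
            self._issued = time.monotonic()
        return self._id
```

Within one rotation window a car can correlate repeated broadcasts into a single track; across windows there is nothing linkable, which is the privacy property the comment is after.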

    20. iknowstuff ◴[] No.41899496{3}[source]
    no need.
    21. SoftTalker ◴[] No.41903503{3}[source]
    Incorrect. Or at least, it varies by state. I was visiting my mother and borrowed her car, had a minor accident with it. Her insurance paid, not mine.

    This is why you are required to have insurance for the cars you own. You may from time to time be driving cars you do not own, and the owners of those cars are required to have insurance for those cars, not you.