    410 points jjulius | 11 comments
    bastawhiz ◴[] No.41889192[source]
    Lots of people are asking how good the self driving has to be before we tolerate it. I got a one-month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

    - It failed with a cryptic system error while driving

    - It started making a left turn far too early, which would have scraped the left side of the car on a sign. I had to manually intervene.

    - In my opinion, the default setting accelerates far too aggressively. I'd call myself a fairly aggressive driver, and it was too much even for my taste.

    - It tried to make way too many right turns on red when it wasn't safe to do so. It would creep into the road, almost into the path of oncoming vehicles.

    - It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

    - It would switch lanes to go faster on the highway, but on at least one occasion it missed an exit because it couldn't get back into the right lane in time. Stupid.

    After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even Autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

    replies(24): >>41889213 #>>41889323 #>>41889348 #>>41889518 #>>41889642 #>>41890213 #>>41890238 #>>41890342 #>>41890380 #>>41890407 #>>41890729 #>>41890785 #>>41890801 #>>41891175 #>>41892569 #>>41894279 #>>41894644 #>>41894722 #>>41894770 #>>41894964 #>>41895150 #>>41895291 #>>41895301 #>>41902130 #
    TheCleric ◴[] No.41890342[source]
    > Lots of people are asking how good the self driving has to be before we tolerate it.

    There’s a simple answer to this. As soon as it’s good enough for Tesla to accept liability for accidents. Until then, if Tesla doesn’t trust it, why should I?

    replies(9): >>41890435 #>>41890716 #>>41890927 #>>41891560 #>>41892829 #>>41894269 #>>41894342 #>>41894760 #>>41896173 #
    ndsipa_pomu ◴[] No.41894342[source]
    > As soon as it’s good enough for Tesla to accept liability for accidents.

    That makes a lot of sense, and not just from a selfish point of view. When a person drives a vehicle, the person is held responsible for how the vehicle behaves on the roads, so it's logical that when a machine drives a vehicle, the machine's manufacturer/designer is held responsible.

    It's a complete con that Tesla promotes their autonomous driving while also having their vehicles suddenly switch to non-autonomous driving, which they claim shifts the responsibility to the human in the driver's seat. Presumably, the idea is that the human should have been watching and approving everything that the vehicle has done up to that point.

    replies(2): >>41894666 #>>41894794 #
    1. andrewaylett ◴[] No.41894666[source]
    The responsibility doesn't shift; it always lies with the human. One problem is that humans are notoriously poor at maintaining attention when supervising automation.

    Until the car is ready to take over as legal driver, it's foolish to set the human driver up for failure in the way that Tesla (and the humans driving Tesla cars) do.

    replies(2): >>41894801 #>>41896371 #
    2. f1shy ◴[] No.41894801[source]
    What?! So if there is a failure and the car goes full throttle (not an autonomous car), it's my responsibility?! You are pretty wrong!!!
    replies(3): >>41895481 #>>41895527 #>>41906386 #
    3. kgermino ◴[] No.41895481[source]
    You are responsible (legally, contractually, morally) for supervising FSD today. If the car decides to stomp on the throttle, you are expected to be ready to hit the brakes.

    The whole point is that this is somewhat of an unreasonable expectation, but it’s what Tesla expects you to do today.

    replies(2): >>41896164 #>>41896283 #
    4. xondono ◴[] No.41895527[source]
    Autopilot, FSD, etc. are all legally classified as ADAS (advanced driver assistance systems), so it’s different from, e.g., your car not responding to controls.

    The liability lies with the driver, and all Tesla needs to prove is that input from the driver will override any decision made by the ADAS.

    5. f1shy ◴[] No.41896164{3}[source]
    My example was explicitly NOT about autonomous driving, because the previous comment seems to imply that you are responsible for everything.
    6. FireBeyond ◴[] No.41896283{3}[source]
    > If the car decides to stomp on the throttle, you are expected to be ready to hit the brakes.

    Didn't Tesla have an issue a couple of years ago where pressing the brake did not disengage the throttle? I.e., if the car has a bug and puts the throttle to 100% and you stand on the brake, the car should cut the throttle to 0, but instead you just had 100% throttle and 100% brake?
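
    For what it's worth, a brake-override interlock of the sort most manufacturers adopted after the Toyota unintended-acceleration recalls is conceptually tiny. Below is a minimal sketch in Python; the signal names and threshold are invented for illustration and have nothing to do with Tesla's actual firmware:

        # Minimal sketch of a brake-override interlock.
        # Signal names and the threshold are illustrative assumptions.
        BRAKE_OVERRIDE_THRESHOLD = 0.05  # fraction of brake-pedal travel

        def commanded_throttle(throttle_request: float, brake_position: float) -> float:
            """Return the throttle command actually sent to the motor controller."""
            if brake_position > BRAKE_OVERRIDE_THRESHOLD:
                # Driver is braking: cut throttle so the brakes never fight the motor.
                return 0.0
            return throttle_request

    The design point is that an interlock like this sits at the lowest level of the stack, so it still works even when higher-level software (including an ADAS) misbehaves.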

    replies(1): >>41897359 #
    7. mannykannot ◴[] No.41896371[source]
    > The responsibility doesn't shift; it always lies with the human.

    Indeed, and that goes for the person or persons who say that the products they sell are safe when used in a certain way.

    8. blackeyeblitzar ◴[] No.41897359{4}[source]
    If it did, it wouldn’t matter. Brakes are required to be stronger than engines.
    replies(1): >>41897796 #
    9. FireBeyond ◴[] No.41897796{5}[source]
    That makes no sense. Yes, they are. But brakes are going to be more responsive and effective with the throttle at 0 than at 100.

    You can't expect the stopping distances to be the same.
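
    Rough numbers make the point. Stopping distance is d = v^2 / (2a); the deceleration figures below are assumed for illustration, not measured data for any particular car:

        # Illustrative stopping-distance arithmetic: d = v**2 / (2 * a).
        v = 30.0              # initial speed in m/s (~108 km/h)
        a_throttle_cut = 9.0  # assumed deceleration (m/s^2) with the throttle cut to 0
        a_fighting = 6.0      # assumed net deceleration if the motor keeps pushing

        print(v**2 / (2 * a_throttle_cut))  # 50.0 m
        print(v**2 / (2 * a_fighting))      # 75.0 m

    Even though the brakes "win" in both cases, the car fighting its own motor takes half again as much distance to stop.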

    10. andrewaylett ◴[] No.41906386[source]
    The point at which we decide that a defect is serious enough to transfer liability is quite case-dependent. If you knew that the throttle was glitchy but hadn't done anything to fix it, yes. If it affected every car from the manufacturer, it's obviously their fault -- but if you ignore the recall then it might be your fault again?

    In this case, the behaviour of the system and the responsibility of the driver are well-established. I'd actually quite like it if Tesla were held responsible for their software, but they somehow continue to skirt the line: they require the driver to retain vigilance, so any system failure is legally the fault of the human, not the car, despite the product being advertised as "Full Self Driving".

    replies(1): >>41906722 #
    11. dragonwriter ◴[] No.41906722{3}[source]
    > The point at which we decide that a defect is serious enough to transfer liability is quite case-dependent. If you knew that the throttle was glitchy but hadn't done anything to fix it, yes. If it affected every car from the manufacturer, it's obviously their fault -- but if you ignore the recall then it might be your fault again?

    In most American jurisdictions' liability law, the more usual thing is to expand liability rather than transfer it. The idea that exactly one -- or at most one -- person or entity should be liable for any given portion of any given harm is popular in places like HN, but the law is much more accepting of the situation where many parties have overlapping liability for the same harm, with none relieving the others.

    The liability of a driver for maintenance and operation within the law is not categorically mutually exclusive with the liability of the manufacturer (and, indeed, every party in the chain of commerce) for manufacturing defects.

    If a car is driven in a way that violates the rules of the road and causes an accident, and a manufacturing defect in a driver assistance system contributed to that, it is quite possible for the driver, the manufacturer of the driver assistance system, the manufacturer of the vehicle (if different from that of the assistance system), and the seller of the vehicle to the driver (if different from the last two), among others, to all be fully liable to those injured for the harms.