
410 points by jjulius | 9 comments
bastawhiz:
Lots of people are asking how good the self-driving has to be before we tolerate it. I got a one-month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early, which would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even Autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

TheCleric:
> Lots of people are asking how good the self driving has to be before we tolerate it.

There’s a simple answer to this: as soon as it’s good enough for Tesla to accept liability for accidents. Until then, if Tesla doesn’t trust it, why should I?

bdcravens:
The liability for killing someone can include prison time.
TheCleric:
Good. If you write software that people rely on with their lives, and it fails, you should be held criminally liable for that.
dmix:
Drug companies and the FDA (since 1906) play a very dangerous and delicate dance all the time when releasing new drugs to the public. But for over a century now we've managed to figure it out without holding pharma companies criminally liable for every death.

> If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.

It's easier to type those words on the internet than to make them policy IRL. That sort of policy would likely result in either a) killing off all commercial efforts to solve traffic deaths via technology, along with vast amounts of other semi-autonomous technology like farm equipment, or b) governments/car companies mandating that the driver be filmed every time the system is turned on, because it's technically supposed to be human-supervised autopilot at this testing stage (outside restricted pilot programs like Waymo taxis). Those distinctions would matter in a criminal courtroom, even if humans can't be relied upon to always follow the instructions on the bottle's label.

ywvcbk:
> criminally liable for every death.

The fact that people generally consume drugs voluntarily, and make that decision after being informed about most of the known risks, probably mitigates that to some extent. Being killed by someone else’s FSD car seems very different.

sokoloff:
Imagine that in 2031, FSD cars could exactly halve all aspects of auto crashes (minor, major, single-car, multi-car, vs. pedestrian, fatal/non-fatal, etc.).

Would you want FSD software to be developed or not? If you do, do you think holding devs or companies criminally liable for half of all crashes is the best way to ensure that progress happens?
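To put rough numbers on that trade-off (a hypothetical sketch; the 40,000 baseline is only the approximate order of magnitude of annual US road deaths):

    # Hypothetical: FSD exactly halves road deaths (sketch, not real data)
    human_deaths_per_year = 40_000   # rough US annual road fatalities
    fsd_deaths_per_year = human_deaths_per_year // 2

    lives_saved = human_deaths_per_year - fsd_deaths_per_year
    print(f"Lives saved per year: {lives_saved:,}")  # 20,000
    # Criminal liability would attach to the 20,000 remaining deaths,
    # even though deployment prevents the other 20,000.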

blackoil:
Say the cars have near-zero casualties in the northern hemisphere but occasionally fail for cars driving on the opposite side of the road in the south. If the company knew about it and chose to ignore it because of profits, then yes, it should be charged criminally.
ekianjo:
> make that decision after being informed about most of the known risks

Like for the COVID-19 vaccines? Experimental yet given to billions without ever showing them a consent form.

ywvcbk:
From a utilitarian perspective, sure, you might be right. But how do you exempt those companies from civil liability and make it impossible for victims or their families to sue the manufacturer? It might be legally tricky: a driver or owner can explicitly or implicitly agree to an EULA or other terms, but imposing those on third parties wouldn’t be right.
ywvcbk:
Yes, but worse. Nobody physically forced anyone to get vaccinated, so you still had some choice. Of course, legally banning individuals from using public roads or sidewalks unless they give up their right to sue Tesla/etc. might be an option.
Majromax:
> how do you exempt those companies from civil liability and make it impossible for victims/their families to sue the manufacturer?

I don't think anyone in this thread has talked about an exemption from civil liability (sue for money), just criminal liability (go to jail).

Civil liability is the far less controversial issue because it's transferred all the time: governments even mandate that drivers carry insurance for this purpose.

With civil liability transfer, imperfect FSD can still make economic sense. Just as an insurance company needs to collect enough premium to pay claims, the FSD manufacturer would need to reserve enough revenue to pay its expected claims. In this case, FSD doesn't even need to be better than humans to make economic sense, in the same way that bad drivers can still buy (expensive) insurance.
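As a back-of-the-envelope sketch of that reserving argument (every number below is a made-up placeholder, not actuarial data):

    # Manufacturer-as-insurer sketch; all figures hypothetical
    fleet_size = 1_000_000           # vehicles with FSD enabled
    crashes_per_vehicle_year = 0.04  # expected at-fault crashes per vehicle-year
    avg_claim_usd = 25_000           # average civil payout per crash

    expected_claims = fleet_size * crashes_per_vehicle_year * avg_claim_usd
    reserve_per_vehicle = crashes_per_vehicle_year * avg_claim_usd
    print(f"Expected annual claims: ${expected_claims:,.0f}")               # $1,000,000,000
    print(f"Reserve needed per vehicle-year: ${reserve_per_vehicle:,.0f}")  # $1,000

As long as buyers will pay that implicit premium, the economics can work even if the system isn't better than a human driver, exactly like the bad driver buying expensive insurance.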

ywvcbk:
> just criminal liability (go to jail).

That just seems like a theoretical possibility (if even that). I don’t see how any engineer, or even someone in management, could go to jail unless intent or gross negligence can be proven.

> drivers carry insurance for this purpose.

The mandatory limit is extremely low in many US states.

> expected claims

That seems like the problem. It might take a while until we reach an equilibrium of some sort.

> that bad drivers can still buy

That’s still capped by the coverage limit plus the total assets held by that bad driver. In Tesla’s case there is no real limit (without legislation/established precedent). Juries/courts would likely be influenced by that fact as well.

DennisP:
In fact, if you buy your insurance from Tesla, you effectively do put civil responsibility for FSD back in their hands.