410 points jjulius | 2 comments
bastawhiz
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates far too aggressively. I'd call myself a fairly aggressive driver, and it's too aggressive even for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't get back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

TheCleric
> Lots of people are asking how good the self driving has to be before we tolerate it.

There’s a simple answer to this. As soon as it’s good enough for Tesla to accept liability for accidents. Until then if Tesla doesn’t trust it, why should I?

bdcravens
The liability for killing someone can include prison time.
TheCleric
Good. If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.
viraptor
That's a dangerous line, and I don't think it's correct. Software I write shouldn't be relied on in critical situations. If someone makes that decision, it's on them, not on me.

The line should be drawn where a person tells others that they can rely on the software with their lives - that is, at the integrator of the end product. Even if I were working on self-driving software, the same thing would apply: if I wrote some alpha-level code for an internal demonstration and some manager decided "good enough, ship it", they should be liable for that decision. (Because I wouldn't be able to stop them, and may have already left by then.)

presentation
To be fair, maybe the software you write shouldn't be relied on in critical situations, but in this case the only places this software could be used are critical situations.
viraptor
Ultimately - yes. But as I mentioned, the fact that it's sold as ready for critical situations doesn't mean the developers thought, or said, it was ready.
elric
I think it should be fairly obvious that it's not the individual developers who are responsible or liable. In critical systems there is a whole chain of liability. That one guy in Nebraska who thanklessly maintains some open source lib that BigCorp is using in their car should obviously not be liable.
f1shy
It depends. If you write bad software and skip reviews and processes, you may be liable. Even if you are told to do something, if you know it's wrong, you should say so. Right now I'm in the middle of s*t because I spoke up.
Filligree
> Right now I'm in the middle of s*t because I spoke up.

And you believe that, despite experiencing what happens if you speak up?

We shouldn't simultaneously demand that people take heroic responsibility while also leaving them high and dry when they do.

f1shy
I do believe I am responsible. I recognize I'm now in a position where I can speak without fear. If I got fired I'd throw a party, tbh.