
410 points by jjulius | 1 comment
bastawhiz
Lots of people are asking how good the self-driving has to be before we tolerate it. I got a one-month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started a left turn so early that it would have scraped the left side of the car on a sign. I had to manually intervene.

- The default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver, and it was too aggressive even for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but on at least one occasion it missed an exit because it couldn't get back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even Autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

TheCleric
> Lots of people are asking how good the self driving has to be before we tolerate it.

There’s a simple answer to this: as soon as it’s good enough for Tesla to accept liability for accidents. Until then, if Tesla doesn’t trust it, why should I?

bdcravens
The liability for killing someone can include prison time.
TheCleric
Good. If you write software that people rely on with their lives, and it fails, you should be held criminally liable.
_rm
What a laugh. Would you take that deal?

Upside: you get paid a 200k salary if all your code works perfectly. Downside: if it doesn't, you go to prison.

The users aren't compelled to use it. They can choose not to. They get to choose their own risks.

The internet is a gold mine of creatively moronic opinions.

thunky
You can go to prison or die for being a bad driver, yet people choose to drive.
ukuina
Systems evolve to handle such liability: drivers pass theory and practical tests to get licensed (and are retested periodically), and an insurance framework gauges your risk level and charges you accordingly.
kergonath
Requiring formal licensing and possibly insurance for developers working on life-critical systems is not that outlandish. On the contrary, that is already the case in serious engineering fields.