
410 points jjulius | 13 comments
bastawhiz No.41889192
Lots of people are asking how good the self-driving has to be before we tolerate it. I got a one-month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started a left turn far too early, one that would have scraped the left side of the car on a sign. I had to intervene manually.

- The default setting accelerates far too aggressively, in my opinion. I'd call myself a fairly aggressive driver, and it was still too much for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even Autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

TheCleric No.41890342
> Lots of people are asking how good the self driving has to be before we tolerate it.

There’s a simple answer to this. As soon as it’s good enough for Tesla to accept liability for accidents. Until then if Tesla doesn’t trust it, why should I?

bdcravens No.41890927
The liability for killing someone can include prison time.
TheCleric No.41891164
Good. If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.
1. _rm No.41891890
What a laugh. Would you take that deal?

Upside: you get paid a 200k salary if all your code works perfectly. Downside: if it doesn't, you go to prison.

The users aren't compelled to use it. They can choose not to. They get to choose their own risks.

The internet is a gold mine of creatively moronic opinions.

2. moralestapia No.41892070
Read the site rules.

And of course some people would take that deal, and others wouldn't. Your argument is moot.

3. thunky No.41892279
You can go to prison or die for being a bad driver, yet people choose to drive.
4. ukuina No.41892668
Systems evolve to handle such liability: drivers pass theory and practical tests to get licensed (and are re-tested periodically), and an insurance framework gauges their risk level and charges them accordingly.
5. _rm No.41893006
You're arguing for the sake of it; you wouldn't take that risk/reward.

Most code has bugs from time to time, even when highly skilled developers are being careful. None of them would drive if their fault rate behind the wheel were similar and the outcome were death.

6. kergonath No.41893635
Requiring formal licensing and possibly insurance for developers working on life-critical systems is not that outlandish. On the contrary, that is already the case in serious engineering fields.
7. notahacker No.41894194
Or to put it even more straightforwardly: people who choose to drive rarely expect to cover more than a few tens of thousands of miles per year. The code written by people who choose to work on autonomous-driving software potentially drives a billion miles per year, encountering far more edge cases that it's expected to handle in a non-dangerous manner, and it has to handle them via advance planning and interaction with a lot of other people's code.

The only practical way around this which permits autonomous vehicles (which are apparently dependent on much more complex and intractable codebases than, say, avionics) is a much higher threshold of criminal responsibility than the "serious consequences resulted from the one-off execution of a dangerous manoeuvre which couldn't be justified in context" standard that sends human drivers to jail. And of course that double standard will be problematic if "willingness to accept liability" is the only safety threshold.

8. ekianjo No.41894827
And yet tens of thousands of people die on the roads every year right now. Is that working well?
9. chgs No.41894907
We need far more regulation of the software industry; far too many people working in it fail to understand the scope of what they do.

A civil engineer kills someone with a bad building: jail. A surgeon removes the wrong lung: jail. A computer programmer kills someone: "oh well, it's your own fault."

10. caddemon No.41895200
I've never heard of a surgeon going to jail over a genuine mistake, even one that killed someone. I'm also not sure what that would accomplish: take away their license to practice medicine, sure, but they're not a threat to society more broadly.
11. 7sidedmarble No.41897174
I don't think anyone's seriously suggesting people be held accountable for bugs that are ultimately accidents. But if you knowingly sign off on, oversee, or are otherwise directly responsible for the construction of software that you know has a good chance of killing people, then yes, there should be consequences for that.
12. _rm No.41903272
You made all that up out of nothing. They'd only go to jail if it was intentional.

The only case where a computer programmer "kills someone" is where he hacks into a system and interferes with it in a way that foreseeably leads to someone's death.

Otherwise, the user voluntarily assumed the risk.

Frankly, if someone lets a computer drive their car, given their own ample experience of computers "crashing", it's basically a form of attempted suicide.

13. thunky No.41903564
Exactly. Just like most car accidents don't result in prison or death, but negligence or recklessness can.