
410 points jjulius | 4 comments
bastawhiz No.41889192
Lots of people are asking how good the self-driving has to be before we tolerate it. I got a one-month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early, one that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.

- It tried to make far too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

TheCleric No.41890342
> Lots of people are asking how good the self driving has to be before we tolerate it.

There’s a simple answer to this. As soon as it’s good enough for Tesla to accept liability for accidents. Until then, if Tesla doesn’t trust it, why should I?

bdcravens No.41890927
The liability for killing someone can include prison time.
TheCleric No.41891164
Good. If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.
viraptor No.41892572
That's a dangerous line to draw, and I don't think it's correct. Software I write shouldn't be relied on in critical situations. If someone makes that decision, it's on them, not on me.

The line should be where a person tells others that they can rely on the software with their lives - as in the integrator for the end product. Even if I was working on the software for self driving, the same thing would apply - if I wrote some alpha level stuff for the internal demonstration and some manager decided "good enough, ship it", they should be liable for that decision. (Because I wouldn't be able to stop them / may have already left by then)

kergonath No.41893594
It’s not that complicated or outlandish. That’s how most engineering fields work. If a building collapses because of design flaws, then the builders and architects can be held responsible. Hell, if a car crashes because of a design or assembly flaw, the manufacturer is held responsible. Why should self-driving software be any different?

If the software is not reliable enough, then don’t use it in a context where it could kill people.

krisoft No.41894185
I think the example here is that the designer draws a bridge for a railway model, and someone decides to use the same design and sends real locomotives across it. Is the original designer (who neither intended this use nor could have foreseen it) liable in your understanding?
ndsipa_pomu No.41894354
That's a ridiculous argument.

If a construction firm takes an arbitrary design and then tries to build it in a totally different environment and for a different purpose, then the construction firm is liable, not the original designer. It'd be like Boeing taking a child's paper aeroplane design and making a passenger jet out of it and then blaming the child when it inevitably fails.

krisoft No.41895101
> That's a ridiculous argument.

Not making an argument. Asking a clarifying question about someone else’s.

> It'd be like Boeing taking a child's paper aeroplane design and making a passenger jet out of it and then blaming the child when it inevitably fails.

Yes exactly. You are using the same example I used to say the same thing. So which part of my message was ridiculous?

ndsipa_pomu No.41895440
If it's not an argument, then you're just misrepresenting your parent poster's comment by introducing a scenario that never happens.

If you didn't intend your comment as a criticism, then you phrased it poorly. Do you actually believe that your scenario happens in reality?

lcnPylGDnU4H9OF No.41895781
It was not a misrepresentation of anything. They were just restating the worry raised in the GP comment. https://news.ycombinator.com/item?id=41892572

And the only reason the commenter I linked to had that response is because its parent comment was slightly careless in its phrasing. Probably just change “write” to “deploy” to capture the intended meaning.

krisoft No.41897990
> you're just misrepresenting your parent poster's comment

I did not represent or misrepresent anything. I have asked a question to better understand their thinking.

> If you didn't intend your comment as a criticism, then you phrased it poorly.

Quite probably. I will have to meditate on it.

> Do you actually believe that your scenario happens in reality?

With railway bridges? Never. It would ring alarm bells for everyone from the fabricators to the locomotive engineer.

With software? All the time. Someone publishes some open source code, someone else at a corporation bolts the open source code into some application, and now the former “toy train bridge” is a load-bearing key component of something the original developer could never have imagined or planned for.

This is not theoretical. Very often I’m the one doing the bolting.

And to be clear: my opinion is that the liability should fall with whoever integrated the code and certified it to be fit for some safety-critical purpose. As an example, if you publish leftpad and I put it into a train brake controller, it is my job to make sure it is doing the right thing. If the train crashes, you as the author of leftpad bear no responsibility, but I as the manufacturer of discount train brakes do.
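The division of responsibility in the leftpad example might be sketched in code: the integrator, not the upstream author, writes and runs acceptance checks that certify the dependency for the safety-critical use. All names here (the stand-in padding function, the brake-message format) are illustrative, not a real protocol.

```python
def third_party_leftpad(s, width, fill=" "):
    # Stand-in for the open-source dependency; the integrator
    # does not control or warrant this code.
    return s.rjust(width, fill)

def acceptance_test(pad):
    # The integrator's own spec. Passing these checks, not the
    # upstream author's promises, is what certifies fitness for use.
    assert pad("7", 3, "0") == "007"
    assert pad("", 2) == "  "
    assert pad("abcd", 2) == "abcd"  # must never truncate

def make_brake_message(pressure_kpa, pad=third_party_leftpad):
    # Fixed-width field for a hypothetical brake-bus message.
    return pad(str(pressure_kpa), 5, "0")

# The integrator runs the checks before trusting the dependency.
acceptance_test(third_party_leftpad)
```

The point the sketch makes is that `acceptance_test` belongs to the brake manufacturer: if the padded message is wrong and the train crashes, the failure is in that certification step, not in the published utility.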