410 points jjulius | 87 comments
bastawhiz ◴[] No.41889192[source]
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

replies(24): >>41889213 #>>41889323 #>>41889348 #>>41889518 #>>41889642 #>>41890213 #>>41890238 #>>41890342 #>>41890380 #>>41890407 #>>41890729 #>>41890785 #>>41890801 #>>41891175 #>>41892569 #>>41894279 #>>41894644 #>>41894722 #>>41894770 #>>41894964 #>>41895150 #>>41895291 #>>41895301 #>>41902130 #
TheCleric ◴[] No.41890342[source]
> Lots of people are asking how good the self driving has to be before we tolerate it.

There’s a simple answer to this. As soon as it’s good enough for Tesla to accept liability for accidents. Until then if Tesla doesn’t trust it, why should I?

replies(9): >>41890435 #>>41890716 #>>41890927 #>>41891560 #>>41892829 #>>41894269 #>>41894342 #>>41894760 #>>41896173 #
bdcravens ◴[] No.41890927[source]
The liability for killing someone can include prison time.
replies(3): >>41891164 #>>41894710 #>>41896926 #
1. TheCleric ◴[] No.41891164[source]
Good. If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.
replies(11): >>41891445 #>>41891631 #>>41891844 #>>41891890 #>>41892022 #>>41892572 #>>41894610 #>>41894812 #>>41895100 #>>41895710 #>>41896899 #
2. beej71 ◴[] No.41891445[source]
And such coders should carry malpractice insurance.
3. dmix ◴[] No.41891631[source]
Drug companies and the FDA (circa 1906) play a very dangerous and delicate dance all the time releasing new drugs to the public. But for over a century now we've managed to figure it out without holding pharma companies criminally liable for every death.

> If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.

It's easier to type those words on the internet than to make them policy IRL. That sort of policy would likely result in either a) killing off all commercial efforts to solve traffic deaths via technology, along with vast amounts of other semi-autonomous technology like farm equipment, or b) government/car companies mandating filming the driver every time they turn it on, because it's technically supposed to be human-assisted autopilot in these testing stages (outside restricted pilot programs like Waymo taxis). Those distinctions would matter in a criminal courtroom, even if humans can't always be relied upon to follow the instructions on the bottle's label.

replies(3): >>41892028 #>>41892069 #>>41893456 #
4. dansiemens ◴[] No.41891844[source]
Are you suggesting that individuals should carry that liability?
replies(1): >>41893851 #
5. _rm ◴[] No.41891890[source]
What a laugh, would you take that deal?

Upside: you get paid a 200k salary if all your code works perfectly. Downside: if it doesn't, you go to prison.

The users aren't compelled to use it. They can choose not to. They get to choose their own risks.

The internet is a gold mine of creatively moronic opinions.

replies(3): >>41892070 #>>41892279 #>>41894907 #
6. bdcravens ◴[] No.41892022[source]
Assuming there's the kind of guard rails as in other industries where this is true, absolutely. (In other words, proper licensing and credentialing, and the ability to prevent a deployment legally)

I would also say that if something gets signed off on by management, that carries an implicit transfer of accountability up the chain from the individual contributor to whoever signed off.

7. ryandrake ◴[] No.41892028[source]
Your take is understandable and not surprising on a site full of software developers. Somehow, the general software industry has ingrained this pessimistic and fatalistic dogma that says bugs are inevitable and there’s nothing you can do to prevent them. Since everyone believes it, it is a self-fulfilling prophecy and we just accept it as some kind of law of nature.

Holding software developers (or their companies) liable for defects would definitely kill off a part of the industry: the very large part that YOLOs code into production and races to get features released without rigorous and exhaustive testing. And why don’t they spend 90% of their time testing and verifying and proving their software has no defects? Because defects are inevitable and they’re not held accountable for them!

replies(4): >>41892592 #>>41892653 #>>41893464 #>>41893804 #
8. hilsdev ◴[] No.41892069[source]
We should hold pharma companies liable for every death. They make money off the success cases; not doing so is another example of privatized profits and socialized risks/costs. Something like a program with reduced costs for those willing to sign away liability could help balance social good against risk.
9. moralestapia ◴[] No.41892070[source]
Read the site rules.

And also, of course some people would take that deal, and of course some others wouldn't. Your argument is moot.

10. thunky ◴[] No.41892279[source]
You can go to prison or die for being a bad driver, yet people choose to drive.
replies(2): >>41892668 #>>41893006 #
11. viraptor ◴[] No.41892572[source]
That's a dangerous line and I don't think it's correct. Software I write shouldn't be relied on in critical situations. If someone makes that decision then it's on them not on me.

The line should be where a person tells others that they can rely on the software with their lives - as in the integrator for the end product. Even if I was working on the software for self driving, the same thing would apply - if I wrote some alpha level stuff for the internal demonstration and some manager decided "good enough, ship it", they should be liable for that decision. (Because I wouldn't be able to stop them / may have already left by then)

replies(3): >>41892970 #>>41893594 #>>41895839 #
12. viraptor ◴[] No.41892592{3}[source]
> that says bugs are inevitable and there’s nothing you can do to prevent them

I don't think people believe this as such. It may be the short way to write it, but what devs actually mean is "bugs are inevitable at the funding/time available". I often say "bugs are inevitable" when in practice it means "you're not going to pay a team for formal specification, a validated implementation, and enough reliable hardware".

Which business will agree to making the process 5x longer and require extra people? Especially if they're not forced there by regulation or potential liability?

13. everforward ◴[] No.41892653{3}[source]
It is true of every field I can think of. Food gets contaminated with salmonella and whatnot frequently. Surgeons forget sponges inside of people (and worse). Truckers run over cars. Manufacturers miss some failures in QA.

Literally everywhere else, we accept that the costs of 100% safety are unreasonably high. People would rather have a mostly safe device for $1 than a definitely safe one for $5. No one wants to pay to have every head of lettuce tested for E. coli, or for truckers to drive at 10mph so they can't kill anyone.

Software isn’t different. For the vast majority of applications where the costs of failure are low to none, people want it to be free and rapidly iterated on even if it fails. No one wants to pay for a formally verified Facebook or DoorDash.

replies(1): >>41893620 #
14. ukuina ◴[] No.41892668{3}[source]
Systems evolve to handle such liability: Drivers pass theory and practical tests to get licensed to drive (and periodically thereafter), and an insurance framework that gauges your risk-level and charges you accordingly.
replies(2): >>41893635 #>>41894827 #
15. presentation ◴[] No.41892970[source]
To be fair, maybe the software you write shouldn't be relied on in critical situations, but in this case the only places this software could be used are critical situations.
replies(1): >>41893226 #
16. _rm ◴[] No.41893006{3}[source]
Arguing for the sake of it; you wouldn't take that risk/reward.

Most code has bugs from time to time, even when highly skilled developers are being careful. None of them would drive if the fault rate were similar and the outcome was death.

replies(2): >>41894194 #>>41897174 #
17. viraptor ◴[] No.41893226{3}[source]
Ultimately - yes. But as I mentioned, the fact it's sold as ready for critical situations doesn't mean the developers thought/said it's ready.
replies(2): >>41893722 #>>41893726 #
18. ywvcbk ◴[] No.41893456[source]
> criminally liable for every death.

The fact that people generally consume drugs voluntarily and make that decision after being informed about most of the known risks probably mitigates that to some extent. Being killed by someone else’s FSD car seems to be very different

replies(2): >>41893905 #>>41894860 #
19. ywvcbk ◴[] No.41893464{3}[source]
Punishing individual developers is of course absurd (unless intent can be proven); the company itself and the upper management, on the other hand? That would make perfect sense.
replies(1): >>41894921 #
20. kergonath ◴[] No.41893594[source]
It’s not that complicated or outlandish. That’s how most engineering fields work. If a building collapses because of design flaws, then the builders and architects can be held responsible. Hell, if a car crashes because of a design or assembly flaw, the manufacturer is held responsible. Why should self-driving software be any different?

If the software is not reliable enough, then don’t use it in a context where it could kill people.

replies(1): >>41894185 #
21. kergonath ◴[] No.41893620{4}[source]
> Literally everywhere else, we accept that the costs of 100% safety are just unreasonably high.

Yes, but also in none of these situations would the consumer/customer/patient be held responsible. I don’t expect a system to be perfect, but I won’t accept any liability if it malfunctions as I use it the way it is intended. And even worse, I would not accept that the designers evade their responsibilities if it kills someone I know.

As the other poster said, I am happy to consider it safe enough the day the company accepts to own its issues and the associated responsibility.

> No one wants to pay for a formally verified Facebook or DoorDash.

This is untenable. Does nobody want a formally verified avionics system in their airliner, either?

replies(1): >>41895466 #
22. kergonath ◴[] No.41893635{4}[source]
Requiring formal licensing and possibly insurance for developers working on life-critical systems is not that outlandish. On the contrary, that is already the case in serious engineering fields.
23. elric ◴[] No.41893722{4}[source]
I think it should be fairly obvious that it's not the individual developers who are responsible/liable. In critical systems there is a whole chain of liability. That one guy in Nebraska who thanklessly maintains some open source lib that BigCorp is using in their car should obviously not be liable.
replies(1): >>41894847 #
24. gmueckl ◴[] No.41893726{4}[source]
But someone slapped that label on it and made a pinky promise that it's true. That person needs to accept liability if things go wrong. If person A is loud and clear that something isn't ready, but person B tells the customer otherwise, B is at fault.

Look, there are well established procedures in a lot of industries where products are relied on to keep people safe. They all require quite rigorous development and certification processes and sneaking untested alpha quality software through such a process would be actively malicious and quite possibly criminal in and of itself, at least in some industries.

replies(1): >>41893832 #
25. tsimionescu ◴[] No.41893804{3}[source]
> And why don’t they spend 90% of their time testing and verifying and proving their software has no defects? Because defects are inevitable and they’re not held accountable for them!

For a huge part of the industry, the reason is entirely different. It is because software that mostly works today but has defects is much more valuable than software that always works and has no defects 10 years from now. Extremely well informed business customers will pay for delivering a buggy feature today rather than wait two more months for a comprehensively tested feature. This is the reality of the majority of the industry: consumers care little about bugs (below some defect rate) and care far more about timeliness.

This of course doesn't apply to critical systems like automatic drivers or medical devices. But the vast majority of the industry is not building these types of systems.

26. viraptor ◴[] No.41893832{5}[source]
This is the beginning of the thread https://news.ycombinator.com/item?id=41891164

You're in violent agreement with me ;)

replies(1): >>41893935 #
27. izacus ◴[] No.41893851[source]
The ones that are identified as making decisions leading to death, yes.

It's completely normal in other fields where engineers build systems that can kill.

replies(2): >>41894849 #>>41901038 #
28. sokoloff ◴[] No.41893905{3}[source]
Imagine that in 2031, FSD cars could exactly halve all aspects of auto crashes (minor, major, single car, multi car, vs pedestrian, fatal/non, etc.)

Would you want FSD software to be developed or not? If you do, do you think holding devs or companies criminally liable for half of all crashes is the best way to ensure that progress happens?

replies(2): >>41894272 #>>41895047 #
29. latexr ◴[] No.41893935{6}[source]
No, the beginning of the thread is earlier. And with that context it seems clear to me that the “you” in the post you linked means “the company”, not “the individual software developer”. No one else in your replies seems confused by that, we all understand self-driving software wasn’t written by a single person that has ultimate decision power within a company.
replies(1): >>41894186 #
30. krisoft ◴[] No.41894185{3}[source]
I think the example here is that the designer draws a bridge for a railway model, and someone decides to use the same design and sends real locomotives across it. Is the original designer (who neither intended nor could have foreseen this) liable in your understanding?
replies(3): >>41894354 #>>41894366 #>>41894816 #
31. viraptor ◴[] No.41894186{7}[source]
If the message said "you release software", or "approve" or "produce", or something like that, sure. But it said "you write software" - and I don't think that can apply to a company, because writing is what individuals do. But yeah, maybe that's not what the author meant.
replies(1): >>41894422 #
32. notahacker ◴[] No.41894194{4}[source]
Or to put it even more straightforwardly: people who choose to drive rarely expect to drive more than a few tens of thousands of miles per year. People who choose to write an autonomous vehicle's code potentially drive a billion miles per year, encounter far more edge cases they are expected to handle in a non-dangerous manner, and have to handle them via advance planning and interactions with a lot of other people's code.

The only practical way around this which permits autonomous vehicles (which are apparently dependent on much more complex and intractable codebases than, say, avionics) is a much higher threshold of criminal responsibility than the "the serious consequences resulted from the one-off execution of a dangerous manoeuvre which couldn't be justified in context" standard that sends human drivers to jail. And of course that double standard will be problematic if "willingness to accept liability" is the only safety threshold.
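
To put rough numbers on that scale difference (a back-of-the-envelope sketch; roughly one fatality per 100 million vehicle miles is the commonly cited US average, and the other figures are assumptions):

    # Back-of-the-envelope scale comparison; all figures assumed.
    human_fatality_rate = 1 / 100_000_000  # ~1 per 100M vehicle miles (US avg)

    driver_miles = 15_000         # a fairly heavy individual driver, per year
    fleet_miles = 1_000_000_000   # the "billion miles per year" above

    print(driver_miles * human_fatality_rate)  # ~0.00015 expected fatalities/yr
    print(fleet_miles * human_fatality_rate)   # ~10 expected fatalities/yr

    # At the same per-mile rate, an individual driver statistically causes
    # a fatality once in ~6,700 driving-years, while code driving a billion
    # miles a year is implicated in ~10 fatalities every single year.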

33. blackoil ◴[] No.41894272{4}[source]
Say the cars cause near zero casualties in the northern hemisphere but occasionally fail for cars driving topsy-turvy in the south. If the company knew about it and chose to ignore it because of profits, then yes, they should be charged criminally.
34. ndsipa_pomu ◴[] No.41894354{4}[source]
That's a ridiculous argument.

If a construction firm takes an arbitrary design and then tries to build it in a totally different environment and for a different purpose, then the construction firm is liable, not the original designer. It'd be like Boeing taking a child's paper aeroplane design and making a passenger jet out of it and then blaming the child when it inevitably fails.

replies(3): >>41894574 #>>41894653 #>>41895101 #
35. kergonath ◴[] No.41894366{4}[source]
Someone, at some point signed off on this being released. Not thinking things through seriously is not an excuse to sell defective cars.
36. latexr ◴[] No.41894422{8}[source]
> and I don't think that can apply to a company, because writing is what individuals do.

By that token, no action could ever apply to a company—including approving, producing, or releasing—since it is a legal entity, a concept, not a physical thing. For all those actions there was a person actually doing it in the name of the company.

It’s perfectly normal to say, for example, “GenericCorp wrote a press-release about their new product”.

37. ◴[] No.41894574{5}[source]
38. mensetmanusman ◴[] No.41894610[source]
Software requires hardware, and hardware can bit-flip when hit by gamma rays.
replies(3): >>41894643 #>>41894885 #>>41894887 #
39. wongarsu ◴[] No.41894653{5}[source]
Or alternatively: if Boeing uses wood screws to attach an airplane door and a screw fails, that's on Boeing, not the airline, pilot, or screw manufacturer. But if it's sold as an aerospace-grade attachment bolt, with attachments for safety wire and a spec sheet suggesting the required loads are within design parameters, then it's the bolt manufacturer's fault when it fails, and they might have to answer for any deaths resulting from that. Unless Boeing knew or should have known that the bolts weren't actually as good as claimed; then the buck passes back to them.

Of course that's wildly oversimplifying and multiple entities can be at fault at once. My point is that these are normal things considered in regular engineering and manufacturing

40. ekianjo ◴[] No.41894812[source]
How is that working with Boeing?
replies(1): >>41895001 #
41. f1shy ◴[] No.41894816{4}[source]
Are you serious?! You must be trolling!
replies(1): >>41895151 #
42. ekianjo ◴[] No.41894827{4}[source]
And yet tens of thousands of people die on the roads right now every year. Working well?
43. f1shy ◴[] No.41894847{5}[source]
It depends. If you write bad software and skip reviews and processes, you may be liable. Even if you are told to do something, if you know it is wrong, you should say so. Right now I'm in the middle of s*t because I spoke up.
replies(1): >>41896160 #
44. A4ET8a8uTh0 ◴[] No.41894849{3}[source]
Pretty much. Fuck. I just watched higher ups sign off on a project I know for a fact has defects all over the place going into production despite our very explicit: don't do it ( not quite Tesla level consequences, but still resulting in real issues for real people ). The sooner we can start having people in jail for knowingly approving half-baked software, the sooner it will improve.
replies(1): >>41895257 #
45. ekianjo ◴[] No.41894860{3}[source]
> make that decision after being informed about most of the known risks

Like for the COVID-19 vaccines? Experimental yet given to billions without ever showing them a consent form.

replies(1): >>41895076 #
46. aaronmdjones ◴[] No.41894885[source]
Which is why hardware used to run safety-critical software is made redundant.

Take the Boeing 777 Primary Flight Computer for example. This is a fully digital fly-by-wire aircraft. There are 3 separate racks of equipment housing identical flight computers; 2 in the avionics bay underneath the flight deck, 1 in the aft cargo section. Each flight computer has 3 separate processors, supporting 2 dissimilar instruction set architectures, running the same software built by 3 separate compilers. Each flight computer captures instances of the software not agreeing about an action to be undertaken and wins by majority vote. The processor that makes these decisions is different in each flight computer.

The power systems that provide each flight computer are also fully redundant; each computer gets power from a power supply assembly, which receives 2 power feeds from 3 separate power supplies; no 2 power supply assemblies share the same 2 sources of power. 2 of the 3 power systems (L engine generator, R engine generator, and the hot battery bus) would have to fail and the APU would have to be unavailable in order to knock out 1 of the 3 computers.

This system has never failed in 30 years of service. There's still a primary flight computer disconnect switch on the overhead panel in the cockpit that takes the software out of the loop and logically connects all of your control inputs to the flight surface actuators. I'm not aware of it ever being used (edit: in a commercial flight).
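
For illustration, here is a minimal sketch of the majority-vote idea (a toy model of triple modular redundancy; the names, tolerance, and averaging are assumptions for the example, not the actual 777 PFC logic):

    # Toy triple-modular-redundancy voter: three independent lanes compute
    # the same value; a single corrupted lane is outvoted by the other two.
    def vote(a, b, c, tol=1e-6):
        for x, y in [(a, b), (a, c), (b, c)]:
            if abs(x - y) <= tol:
                return (x + y) / 2  # value agreed by at least two lanes
        raise RuntimeError("no majority: flag lane fault, disconnect")

    # A bit flip in one lane never reaches the control surface:
    print(vote(12.5, 12.5, 9001.0))  # -> 12.5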

replies(1): >>41895814 #
47. chgs ◴[] No.41894887[source]
You can control for that. Multiple machines doing rival calculations, for example.
48. chgs ◴[] No.41894907[source]
We need far more regulation of the software industry; far too many people working in it fail to understand the scope of what they do.

Civil engineer kills someone with a bad building, jail. Surgeon removes the wrong lung, jail. Computer programmer kills someone, “oh well it’s your own fault”.

replies(2): >>41895200 #>>41903272 #
49. chgs ◴[] No.41894921{4}[source]
You have one person in that RACI accountable box. That’s the engineer signing it off as fit. They are held accountable, including with jail if required.
50. mlinhares ◴[] No.41895001[source]
People often forget corporations don’t go to jail. Murder when you’re not a person ends up with a slap.
51. ywvcbk ◴[] No.41895047{4}[source]
From a utilitarian perspective, sure, you might be right. But how do you exempt those companies from civil liability and make it impossible for victims/their families to sue the manufacturer? It might be legally tricky (the driver/owner can explicitly or implicitly agree to an EULA or other agreement, but imposing that on third parties wouldn't be right).
replies(1): >>41895366 #
52. ywvcbk ◴[] No.41895076{4}[source]
Yes, but worse. Nobody physically forced anyone to get vaccinated so you still had some choice. Of course legally banning individuals from using public roads or sidewalks unless they give up their right to sue Tesla/etc. might be an option.
53. bossyTeacher ◴[] No.41895100[source]
That doesn't seem to happen in the medical and airplane industries; otherwise, Boeing would most likely not exist as a company anymore.
replies(1): >>41895177 #
54. krisoft ◴[] No.41895101{5}[source]
> That's a ridiculous argument.

Not making an argument. Asking a clarifying question about someone else’s.

> It'd be like Boeing taking a child's paper aeroplane design and making a passenger jet out of it and then blaming the child when it inevitably fails.

Yes exactly. You are using the same example I used to say the same thing. So which part of my message was ridiculous?

replies(1): >>41895440 #
55. krisoft ◴[] No.41895151{5}[source]
I assure you I am not trolling. You appear to have misread my message.

Take a deep breath. Read my message one more time carefully. Notice the question mark at the end of the last sentence. Think about it. If after that you still think I’m trolling you or anyone else I will be here and happy to respond to your further questions.

56. jsvlrtmred ◴[] No.41895177[source]
Perhaps one can debate whether it happens often enough or severely enough, but it certainly happens. For example (and only the first one to come to mind), the president of PIP went to jail.
57. caddemon ◴[] No.41895200{3}[source]
I've never heard of a surgeon going to jail over a genuine mistake even if it did kill someone. I'm also not sure what that would accomplish - take away their license to practice medicine sure, but they're not a threat to society more broadly.
58. IX-103 ◴[] No.41895257{4}[source]
Should we require Professional Engineers to sign off on such projects the same way they are required to for other safety critical infrastructure (like bridges and dams)? The Professional Engineer that signed off is liable for defects in the design. (Though, of course, if the design is not followed then liability can shift back to the company that built it)
replies(1): >>41898367 #
59. Majromax ◴[] No.41895366{5}[source]
> how do you exempt those companies from civil liability and make it impossible for victims/their families to sue the manufacturer?

I don't think anyone in this thread has talked about an exemption from civil liability (sue for money), just criminal liability (go to jail).

Civil liability is the far less controversial issue because it's transferred all the time: governments even mandate that drivers carry insurance for this purpose.

With civil liability transfer, imperfect FSD can still make economic sense. Just as an insurance company needs to collect enough premium to pay claims, the FSD manufacturer would need to reserve enough revenue to pay its expected claims. In this case, FSD doesn't even need to be better than humans to make economic sense, in the same way that bad drivers can still buy (expensive) insurance.
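
As a toy illustration of that economic point (all numbers invented for the example):

    # Pricing FSD civil liability like insurance; invented figures.
    miles_per_year = 12_000
    human_claim_cost_per_mile = 0.10   # assumed average claim cost, $/mile
    fsd_relative_risk = 1.5            # suppose FSD is 50% WORSE than average

    reserve = miles_per_year * human_claim_cost_per_mile * fsd_relative_risk
    print(f"required reserve: ${reserve:,.0f} per car-year")  # $1,800

    # If owners will pay at least that much for the feature, the manufacturer
    # can absorb the expected claims and still profit, exactly as insurers
    # profitably cover bad drivers by charging them high premiums.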

replies(2): >>41895467 #>>41895767 #
60. ndsipa_pomu ◴[] No.41895440{6}[source]
If it's not an argument, then you're just misrepresenting your parent poster's comment by introducing a scenario that never happens.

If you didn't intend your comment as a criticism, then you phrased it poorly. Do you actually believe that your scenario happens in reality?

replies(2): >>41895781 #>>41897990 #
61. everforward ◴[] No.41895466{5}[source]
You could be held liable if it impacts someone else. A restaurant serving improperly cooked chicken that gives people E. coli is liable. Private citizens may not have that duty, I'm not sure.

You would likely also be liable if you overloaded an electrical cable, causing a fire that killed someone.

“Using it in the way it was intended” is largely circular reasoning; of course it wasn’t intended to hurt anyone, so any usage that does hurt someone was clearly unintended. People frequently harm each other by misusing items in ways they didn’t realize were misuses.

> This is untenable. Does nobody want a formally verified avionics system in their airliner, either?

Not for the price it would cost. Airbus is the pioneer here, and even they apply formal verification sparingly. Here’s a paper from a few years ago about it, and how it’s untenable to formally verify the whole thing: https://www.di.ens.fr/~delmas/papers/fm09.pdf

Software development effort generally tends to scale superlinearly with complexity. I am not an expert, but the impression I get is that formal verification grows exponentially with complexity to the point that it is untenable for most things beyond research and fairly simple problems. It is a huge pain in the ass to do something like putting time bounds around reading a config file.

IO also sucks in formal verification from what I hear, and that’s like 80% of what a plane does. Read these 300 signals, do some standard math, output new signals to controls.

These things are much easier to do with tests, but tests only check for scenarios you’ve thought of already
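
A toy example of that last point (invented, nothing to do with avionics):

    # A function with a subtle defect, plus the tests its author thought of.
    def is_leap(year):
        return year % 4 == 0 and year % 100 != 0  # forgot the 400-year rule

    assert is_leap(2024)      # passes
    assert not is_leap(1900)  # passes
    assert not is_leap(2023)  # passes

    # Exhaustive checking over the whole input range (which is what formal
    # verification approximates) finds the case nobody wrote a test for:
    import calendar
    bad = [y for y in range(1800, 2101) if is_leap(y) != calendar.isleap(y)]
    print(bad)  # -> [2000]; the handwritten tests all pass anyway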

replies(1): >>41898516 #
62. ywvcbk ◴[] No.41895467{6}[source]
> just criminal liability (go to jail).

That just seems like a theoretical possibility (even if that). I don’t see how any engineer or even someone in management could go to jail unless intent or gross negligence can be proven.

> drivers carry insurance for this purpose.

The mandatory limit is extremely low in many US states.

> expected claims

That seems like the problem. It might take a while until we reach an equilibrium of some sort.

> that bad drivers can still buy

That’s still capped by the amount of coverage plus the total assets held by that bad driver. In Tesla’s case there is no real limit (without legislation/established precedent). Juries/courts would likely be influenced by that fact as well.

63. hibikir ◴[] No.41895710[source]
Remember that this is neural networks doing the driving, more than old expert systems: what makes a crash happen is a network that fails to read an image correctly, or a network that fails to capture what is going on when melding input from different sensors.

So the blame won't be on a guy who got an if statement backwards, but on signing off on stopping training, failing to have certain kinds of pictures in the training set, or some other similar, higher-order problem. Blame will be incredibly nebulous.

replies(1): >>41902224 #
64. DennisP ◴[] No.41895767{6}[source]
In fact, if you buy your insurance from Tesla, you effectively do put civil responsibility for FSD back in their hands.
65. lcnPylGDnU4H9OF ◴[] No.41895781{7}[source]
It was not a misrepresentation of anything. They were just restating the worry that was stated in the GP comment. https://news.ycombinator.com/item?id=41892572

And the only reason the commenter I linked to had that response is because its parent comment was slightly careless in its phrasing. Probably just change “write” to “deploy” to capture the intended meaning.

66. mensetmanusman ◴[] No.41895814{3}[source]
You can’t guarantee the hardware was properly built.
replies(1): >>41895873 #
67. sigh_again ◴[] No.41895839[source]
>Software I write shouldn't be relied on in critical situations.

Then don't write software to be used in things that are literally always critical situations, like cars.

68. aaronmdjones ◴[] No.41895873{4}[source]
Unless Intel, Motorola, and AMD all conspire to give you a faulty processor, you will get a working primary flight computer.

Besides, this is what flight testing is for. Aviation certification authorities don't let an aircraft serve passengers unless you can demonstrate that all of its safety-critical systems work properly and that it performs as described.

I find it hard to believe that automotive works much differently in this regard, which is what things like crumple zone crash tests are for.

69. Filligree ◴[] No.41896160{6}[source]
> Right now I’m in middle of s*t because of I spoked up.

And you believe that, despite experiencing what happens if you speak up?

We shouldn’t simultaneously require people to take heroic responsibility, while also leaving them high and dry if they do.

replies(1): >>41896521 #
70. f1shy ◴[] No.41896521{7}[source]
I do believe I am responsible. I recognize I am now in a position where I can speak without fear. If I get fired I would throw a party, tbh.
71. sashank_1509 ◴[] No.41896899[source]
Do we send Boeing engineers to jail when their plane crashes?

Intention matters when passing judgement on a crime. If a mother causes the death of her baby through some poor decision (say, feeding her something contaminated), no one proposes or tries to jail the mother, because they know the intention was the opposite.

replies(1): >>41901773 #
72. 7sidedmarble ◴[] No.41897174{4}[source]
I don't think anyone's seriously suggesting people be held accountable for bugs which are ultimately accidents. But if you knowingly sign off on, oversee, or are otherwise directly responsible for the construction of software that you know has a good chance of killing people, then yes, there should be consequences for that.
replies(1): >>41903564 #
73. krisoft ◴[] No.41897990{7}[source]
> you're just misrepresenting your parent poster's comment

I did not represent or misrepresent anything. I have asked a question to better understand their thinking.

> If you didn't intend your comment as a criticism, then you phrased it poorly.

Quite probably. I will have to meditate on it.

> Do you actually believe that your scenario happens in reality?

With railway bridges? Never. It would ring alarm bells for everyone from the fabricators to the locomotive engineer.

With software? All the time. Someone publishes some open source code; someone else at a corporation bolts it into some application, and now the former “toy train bridge” is a load-bearing key component of something the original developer could never have imagined or planned for.

This is not theoretical. Very often I’m the one doing the bolting.

And to be clear: my opinion is that the liability should fall on whoever integrated the code and certified it as fit for some safety-critical purpose. As an example, if you publish leftpad and I put it into a train brake controller, it is my job to make sure it is doing the right thing. If the train crashes, you as the author of leftpad bear no responsibility, but I as the manufacturer of discount train brakes do.

74. A4ET8a8uTh0 ◴[] No.41898367{5}[source]
I hesitate, because I shudder at the government deciding which algorithm is best for a given scenario (because that is effectively where it would go). Maybe the distinction is the moment money changes hands based on the product?

I am not an engineer, but I have watched clearly bad technical decisions being made because a person with a title that went to their head, and a bonus not aligned with the right incentives, messed things up for the rest of us. Maybe some professionalization of software engineering is in order.

replies(1): >>41902255 #
75. kergonath ◴[] No.41898516{6}[source]
> You could be held liable if it impacts someone else. A restaurant serving improperly cooked chicken that gives people E Coli is liable. Private citizens may not have that duty, I’m not sure. > You would likely also be liable if you overloaded an electrical cable, causing a fire that killed someone.

Right. But neither of these examples are following guidelines or proper use. If I turn the car into people on the pavement, I am responsible. If the steering wheel breaks and the car does it, then the manufacturer is responsible (or the mechanic, if the steering wheel was changed). The question at hand is whose responsibility it is if the car’s software does it.

> “Using it in the way it was intended” is largely circular reasoning; of course it wasn’t intended to hurt anyone, so any usage that does hurt someone was clearly unintended.

This is puzzling. You seem to be conflating use and consequences and I am not quite sure how you read that in what I wrote. Using a device normally should not make it kill people, I guess at least we can agree on that. Therefore, if a device kills people, then it is either improper use (and the fault of the user), or a defective device, at which point it is the fault of the designer or manufacturer (or whoever did the maintenance, as the case might be, but that’s irrelevant in this case).

Each device has a manual and a bunch of regulations about its expected behaviour and standard operating procedures. There is nothing circular about it.

> Not for the price it would cost.

Ok, if you want to go full pedantic, note that I wrote “want”, not “expect”.

76. Dylan16807 ◴[] No.41901038{3}[source]
That's liability for defective design, not any time it fails as suggested above.
77. davkan ◴[] No.41901773[source]
This is why we have criminal negligence. Did the mother open a sealed package from the grocery store or did she find an open one on the ground?

Harder to apply to software, but maybe there should be some legal liability involved when a sysadmin uses admin/admin and health information is leaked.

Some Boeing employees should absolutely be in jail over the MCAS system and the hundreds of people who died as a result. But the actions there go beyond negligence anyway.

78. snovv_crash ◴[] No.41902224[source]
This is the difference between a Professional Engineer (i.e. the protected term) and everyone else who calls themselves an engineer. They can put their signature on a system, which then holds them criminally liable if it fails.

Bridges, elevators, buildings, ski lifts etc. all require a professional engineer to sign off on them before they can be built. Maybe self driving cars need the same treatment.

79. snovv_crash ◴[] No.41902255{6}[source]
This isn't a matter of the government saying what you need to do. This is a matter of being held criminally liable if people get hurt.
replies(1): >>41903713 #
80. _rm ◴[] No.41903272{3}[source]
You made all that up out of nothing. They'd only go to jail if it was intentional.

The only case where a computer programmer "kills someone" is where he hacks into a system and interferes with it in a way that foreseeably leads to someone's death.

Otherwise, the user voluntarily assumed the risk.

Frankly if someone lets a computer drive their car, given their own ample experiences of computers "crashing", it's basically a form of attempted suicide.

81. thunky ◴[] No.41903564{5}[source]
Exactly. Just like most car accidents don't result in prison or death. But negligence or recklessness can do it.
82. A4ET8a8uTh0 ◴[] No.41903713{7}[source]
You are only technically correct. And even then, in terms of civics, by having people held criminally liable the government is telling you what to do (or, technically, what not to do). Note that no other body can (legally) do that. In fact, false imprisonment is itself a punishable offense, but I digress.

Now, we could argue over whether that is/should/could/would be the law of the land, but have you considered how it would be enforced?

I mean, I can tell you first hand what it looks like when government gives an industry a vague law to figure out and an enforcement agency with a broad mandate.

That said, I may have exaggerated a little bit on the algo choice. I was shooting for ghoulish overkill.

replies(1): >>41905305 #
83. freejazz ◴[] No.41905305{8}[source]
> You are only technically correct

You clearly have no idea how civil liability works. At all.

replies(1): >>41905548 #
84. A4ET8a8uTh0 ◴[] No.41905548{9}[source]
I am here to learn. You can help me by educating me. I do mean it sincerely. If you think you have a grasp on the subject, I think HN as a whole could benefit from your expertise.
replies(1): >>41905564 #
85. freejazz ◴[] No.41905564{10}[source]
Civil liability isn't determined by the "gov't"; it's determined by a jury of your peers. More interesting to me is how you came to the impression that you had any idea what you were talking about, to the point you felt justified in making your post.
replies(1): >>41905827 #
86. A4ET8a8uTh0 ◴[] No.41905827{11}[source]
My friend. Thank you. It is not often I get to be myself lately. Allow me to retort in kind.

Your original response to my response was in turn a response to the following sentence by "snovv_crash":

"This isn't a matter of the government saying what you need to do. This is a matter of being held criminally liable if people get hurt."

I do want to point out that from the beginning the narrow scope of this argument defined the type of liability as criminal, not civil as your post suggested. In other words, your whole point kinda falls apart, as I was not talking about civil liability but about the connection between civics and the government's (or society's, depending on your philosophical bent) monopoly on violence.

It is possible that the word civics threw you off, but I was merely referring to the study of the rights, duties, and obligations of citizens in a society. Surely you would agree that writing code that kills people falls under the purview of the rights, duties, and obligations of individuals in a society?

In either case, I am not sure what you are arguing for here. It is not just that you are wrong; you seem to be oddly focused on trying to... I am not even sure what. Maybe I should ask you instead.

<<More interesting to me is how you came to the impression that you had any idea what you were talking about to the point you felt justified in making your post.

Yes, good question. Now that I have replied, I feel it would not be a bad idea (edit: for you) to present why you feel (and I use that verb consciously) you can just throw salad willy-nilly, not only with confidence but, clearly, with justification worthy of a justicar.

tldr: You are wrong, but can you even accept that you are wrong? Now that will be an interesting thing to see.

<< that you had any idea

I am a guy on the internet man. No one has any idea about anything. Cheer up:D

replies(1): >>41906548 #
87. freejazz ◴[] No.41906548{12}[source]
In a criminal court, guilt (not liability) is also determined by a jury of your peers, and not the gov't.