410 points jjulius | 148 comments
bastawhiz ◴[] No.41889192[source]
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

replies(24): >>41889213 #>>41889323 #>>41889348 #>>41889518 #>>41889642 #>>41890213 #>>41890238 #>>41890342 #>>41890380 #>>41890407 #>>41890729 #>>41890785 #>>41890801 #>>41891175 #>>41892569 #>>41894279 #>>41894644 #>>41894722 #>>41894770 #>>41894964 #>>41895150 #>>41895291 #>>41895301 #>>41902130 #
1. TheCleric ◴[] No.41890342[source]
> Lots of people are asking how good the self driving has to be before we tolerate it.

There’s a simple answer to this. As soon as it’s good enough for Tesla to accept liability for accidents. Until then if Tesla doesn’t trust it, why should I?

replies(9): >>41890435 #>>41890716 #>>41890927 #>>41891560 #>>41892829 #>>41894269 #>>41894342 #>>41894760 #>>41896173 #
2. genocidicbunny ◴[] No.41890435[source]
I think this is probably both the most concise and most reasonable take. It doesn't require anyone to define some level of autonomy or argue about specific edge cases of how the self driving system behaves. And it's easy to apply this principle to not only Tesla, but to all companies making self driving cars and similar features.
3. concordDance ◴[] No.41890716[source]
Whats the current total liability cost for all Tesla drivers?

The average for all USA cars seems to be around $2000/year, so even if FSD were half as dangerous, Tesla would still be paying the equivalent of $1000/year per car (not sure how big insurance margins are; assuming they're nominal).

Now, if legally the driver could avoid paying insurance for the few times they want/need to drive themselves (e.g. snow? Dunno what FSD supports atm) then it might make sense economically, but otherwise I don't think it would work out.
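
A rough back-of-envelope of that comparison, as a sketch (the $2000/year premium and the 50% risk reduction are just the assumptions above, not actuarial data):

    # Illustrative only: numbers are the assumptions from this comment,
    # not real insurance or crash statistics.
    avg_annual_premium = 2000.0   # assumed average US premium, $/year
    fsd_relative_risk  = 0.5      # assume FSD is half as dangerous as the average driver

    implied_cost_per_car = avg_annual_premium * fsd_relative_risk
    print(f"Implied liability cost Tesla would carry: ${implied_cost_per_car:.0f}/year per car")
    # ~$1000/year per car; the economics only close if the owner's own
    # premium drops by at least that much once Tesla takes on the liability.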

replies(2): >>41890796 #>>41893401 #
4. Retric ◴[] No.41890796[source]
Liability alone isn’t nearly that high.

Car insurance payments include people stealing your car, uninsured motorists, rental cars, and other issues not the drivers fault. Further insurance payments also include profits for the insurance company, advertising, billing, and other overhead from running a business.

Also, if Tesla was taking on these risks you’d expect your insurance costs to drop.

replies(3): >>41890817 #>>41890872 #>>41893427 #
5. TheCleric ◴[] No.41890817{3}[source]
Yeah any automaker doing this would just negotiate a flat rate per car in the US and the insurer would average the danger to make a rate. This would be much cheaper than the average individual’s cost for liability on their insurance.
replies(3): >>41892045 #>>41892322 #>>41893444 #
6. concordDance ◴[] No.41890872{3}[source]
Good points, thanks.
7. bdcravens ◴[] No.41890927[source]
The liability for killing someone can include prison time.
replies(3): >>41891164 #>>41894710 #>>41896926 #
8. TheCleric ◴[] No.41891164[source]
Good. If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.
replies(11): >>41891445 #>>41891631 #>>41891844 #>>41891890 #>>41892022 #>>41892572 #>>41894610 #>>41894812 #>>41895100 #>>41895710 #>>41896899 #
9. beej71 ◴[] No.41891445{3}[source]
And such coders should carry malpractice insurance.
10. renewiltord ◴[] No.41891560[source]
This is how I feel about nuclear energy. Every single plant should need to form a full insurance fund dedicated to paying out if there's trouble. And the plant should have strict liability: anything that happens from materials it releases is its responsibility.

But people get upset about this. We need corporations to take responsibility.

replies(2): >>41891771 #>>41894412 #
11. dmix ◴[] No.41891631{3}[source]
Drug companies and the FDA (circa 1906) play a very dangerous and delicate dance all the time releasing new drugs to the public. But for over a century now we've managed to figure it out without holding pharma companies criminally liable for every death.

> If you write software that people rely on with their lives, and it fails, you should be held liable for that criminally.

Easier to type those words on the internet than to make them policy IRL. That sort of policy IRL would likely result in a) killing off all commercial efforts to solve traffic deaths via technology and vast amounts of other semi-autonomous technology like farm equipment, or b) government/car companies mandating filming the driver every time they turn it on, because it's technically supposed to be human-assisted autopilot in these testing stages (outside restricted pilot programs like Waymo taxis). Those distinctions would matter in a criminal courtroom, even if humans can't be relied upon to always follow the instructions on the bottle's label.

replies(3): >>41892028 #>>41892069 #>>41893456 #
12. idiotsecant ◴[] No.41891771[source]
While we're at it, why not apply the same standard to coal and natural gas plants? For some reason when we start talking about nuclear plants we all of a sudden become averse to the idea of unfunded externalities, but when we're talking about 'old' tech that has been steadily irradiating your community and changing the gas composition of the entire planet, it becomes less concerning.
replies(2): >>41894020 #>>41895652 #
13. dansiemens ◴[] No.41891844{3}[source]
Are you suggesting that individuals should carry that liability?
replies(1): >>41893851 #
14. _rm ◴[] No.41891890{3}[source]
What a laugh, would you take that deal?

Upside: you get paid a 200k salary, if all your code works perfectly. Downside: if it doesn't, you go to prison.

The users aren't compelled to use it. They can choose not to. They get to choose their own risks.

The internet is a gold mine of creatively moronic opinions.

replies(3): >>41892070 #>>41892279 #>>41894907 #
15. bdcravens ◴[] No.41892022{3}[source]
Assuming there's the kind of guard rails as in other industries where this is true, absolutely. (In other words, proper licensing and credentialing, and the ability to prevent a deployment legally)

I would also say that if something gets signed off on by management, that carries an implicit transfer of accountability up the chain from the individual contributor to whoever signed off.

16. ryandrake ◴[] No.41892028{4}[source]
Your take is understandable and not surprising on a site full of software developers. Somehow, the general software industry has ingrained this pessimistic and fatalistic dogma that says bugs are inevitable and there’s nothing you can do to prevent them. Since everyone believes it, it is a self-fulfilling prophecy and we just accept it as some kind of law of nature.

Holding software developers (or their companies) liable for defects would definitely kill off a part of the industry: the very large part that YOLOs code into production and races to get features released without rigorous and exhaustive testing. And why don’t they spend 90% of their time testing and verifying and proving their software has no defects? Because defects are inevitable and they’re not held accountable for them!

replies(4): >>41892592 #>>41892653 #>>41893464 #>>41893804 #
17. ryandrake ◴[] No.41892045{4}[source]
Somehow I doubt those savings would be passed along to the individual car buyer. Surely buying a car insured by the manufacturer would be much more expensive than buying the car plus your own individual insurance, because the car company would want to profit from both.
18. hilsdev ◴[] No.41892069{4}[source]
We should hold pharma companies liable for every death. They make money off the success cases. Not doing so is another example of privatized profits and socialized risks/costs. Something like a program with reduced costs for those willing to sign away liability could help balance social good against risk.
19. moralestapia ◴[] No.41892070{4}[source]
Read the site rules.

And also, of course some people would take that deal, and of course some others wouldn't. Your argument is moot.

20. thunky ◴[] No.41892279{4}[source]
You can go to prison or die for being a bad driver, yet people choose to drive.
replies(2): >>41892668 #>>41893006 #
21. thedougd ◴[] No.41892322{4}[source]
And it would be supplementary to the driver’s insurance, only covering incidents that happen while FSD is engaged. Arguably they would self insure and only purchase insurance for Tesla as a back stop to their liability, maybe through a reinsurance market.
22. viraptor ◴[] No.41892572{3}[source]
That's a dangerous line and I don't think it's correct. Software I write shouldn't be relied on in critical situations. If someone makes that decision then it's on them not on me.

The line should be where a person tells others that they can rely on the software with their lives - as in the integrator for the end product. Even if I was working on the software for self driving, the same thing would apply - if I wrote some alpha level stuff for the internal demonstration and some manager decided "good enough, ship it", they should be liable for that decision. (Because I wouldn't be able to stop them / may have already left by then)

replies(3): >>41892970 #>>41893594 #>>41895839 #
23. viraptor ◴[] No.41892592{5}[source]
> that says bugs are inevitable and there’s nothing you can do to prevent them

I don't think people believe this as such. It may be the short way to write it, but what devs actually mean is "bugs are inevitable at the funding/time available". I often say "bugs are inevitable" when in practice it means "you're not going to pay a team for formal specification, validated implementation and enough reliable hardware".

Which business will agree to making the process 5x longer and require extra people? Especially if they're not forced there by regulation or potential liability?

24. everforward ◴[] No.41892653{5}[source]
It is true of every field I can think of. Food gets salmonella and what not frequently. Surgeons forget sponges inside of people (and worse). Truckers run over cars. Manufacturers miss some failures in QA.

Literally everywhere else, we accept that the costs of 100% safety are just unreasonably high. People would rather have a mostly safe device for $1 than a definitely safe one for $5. No one wants to pay to have every head of lettuce tested for E Coli, or truckers to drive at 10mph so they can’t kill anyone.

Software isn’t different. For the vast majority of applications where the costs of failure are low to none, people want it to be free and rapidly iterated on even if it fails. No one wants to pay for a formally verified Facebook or DoorDash.

replies(1): >>41893620 #
25. ukuina ◴[] No.41892668{5}[source]
Systems evolve to handle such liability: Drivers pass theory and practical tests to get licensed to drive (and periodically thereafter), and an insurance framework that gauges your risk-level and charges you accordingly.
replies(2): >>41893635 #>>41894827 #
26. tiahura ◴[] No.41892829[source]
I think that’s implicit in the promise of the upcoming-any-year-now unattended full self driving.
27. presentation ◴[] No.41892970{4}[source]
To be fair, maybe the software you write shouldn't be relied on in critical situations, but in this case the only place this software could be used is in critical situations.
replies(1): >>41893226 #
28. _rm ◴[] No.41893006{5}[source]
Arguing for the sake of it; you wouldn't take that risk/reward.

Most code has bugs from time to time even when highly skilled developers are being careful. None of them would drive if the fault rate was similar and the outcome was death.

replies(2): >>41894194 #>>41897174 #
29. viraptor ◴[] No.41893226{5}[source]
Ultimately - yes. But as I mentioned, the fact it's sold as ready for critical situations doesn't mean the developers thought/said it's ready.
replies(2): >>41893722 #>>41893726 #
30. ywvcbk ◴[] No.41893401[source]
Also I wouldn't be surprised if any potential wrongful death lawsuits could cost Tesla several orders of magnitude more than the current average.
31. ywvcbk ◴[] No.41893427{3}[source]
How much would every death or severe injury caused by FSD cost Tesla? We probably won't know anytime soon, but unlike almost anyone else they can afford to pay out virtually unlimited amounts, and courts will presumably take that into account.
32. ywvcbk ◴[] No.41893444{4}[source]
What if someone gets killed because of some clear bug/error and the jury decides to award 100s of millions just for that single case? I'm not sure it's trivial for insurance companies to account for that sort of risk.
replies(2): >>41894378 #>>41894784 #
33. ywvcbk ◴[] No.41893456{4}[source]
> criminally liable for every death.

The fact that people generally consume drugs voluntarily and make that decision after being informed about most of the known risks probably mitigates that to some extent. Being killed by someone else’s FSD car seems to be very different

replies(2): >>41893905 #>>41894860 #
34. ywvcbk ◴[] No.41893464{5}[source]
Punishing individual developers is of course absurd (unless intent can be proven). The company itself and upper management, on the other hand? Holding them liable would make perfect sense.
replies(1): >>41894921 #
35. kergonath ◴[] No.41893594{4}[source]
It’s not that complicated or outlandish. That’s how most engineering fields work. If a building collapses because of design flaws, then the builders and architects can be held responsible. Hell, if a car crashes because of a design or assembly flaw, the manufacturer is held responsible. Why should self-driving software be any different?

If the software is not reliable enough, then don’t use it in a context where it could kill people.

replies(1): >>41894185 #
36. kergonath ◴[] No.41893620{6}[source]
> Literally everywhere else, we accept that the costs of 100% safety are just unreasonably high.

Yes, but also in none of these situations would the consumer/customer/patient be held responsible. I don’t expect a system to be perfect, but I won’t accept any liability if it malfunctions as I use it the way it is intended. And even worse, I would not accept that the designers evade their responsibilities if it kills someone I know.

As the other poster said, I am happy to consider it safe enough the day the company accepts to own its issues and the associated responsibility.

> No one wants to pay for a formally verified Facebook or DoorDash.

This is untenable. Does nobody want a formally verified avionics system in their airliner, either?

replies(1): >>41895466 #
37. kergonath ◴[] No.41893635{6}[source]
Requiring formal licensing and possibly insurance for developers working on life-critical systems is not that outlandish. On the contrary, that is already the case in serious engineering fields.
38. elric ◴[] No.41893722{6}[source]
I think it should be fairly obvious that it's not the individual developers who are responsible/liable. In critical systems there is a whole chain of liability. That one guy in Nebraska who thanklessly maintains some open source lib that BigCorp is using in their car should obviously not be liable.
replies(1): >>41894847 #
39. gmueckl ◴[] No.41893726{6}[source]
But someone slapped that label on it and made a pinky promise that it's true. That person needs to accept liability if things go wrong. If person A is loud and clear that something isn't ready, but person B tells the customer otherwise, B is at fault.

Look, there are well established procedures in a lot of industries where products are relied on to keep people safe. They all require quite rigorous development and certification processes and sneaking untested alpha quality software through such a process would be actively malicious and quite possibly criminal in and of itself, at least in some industries.

replies(1): >>41893832 #
40. tsimionescu ◴[] No.41893804{5}[source]
> And why don’t they spend 90% of their time testing and verifying and proving their software has no defects? Because defects are inevitable and they’re not held accountable for them!

For a huge part of the industry, the reason is entirely different. It is because software that mostly works today but has defects is much more valuable than software that always works and has no defects 10 years from now. Extremely well informed business customers will pay for delivering a buggy feature today rather than wait two more months for a comprehensively tested feature. This is the reality of the majority of the industry: consumers care little about bugs (below some defect rate) and care far more about timeliness.

This of course doesn't apply to critical systems like automatic drivers or medical devices. But the vast majority of the industry is not building these types of systems.

41. viraptor ◴[] No.41893832{7}[source]
This is the beginning of the thread https://news.ycombinator.com/item?id=41891164

You're in violent agreement with me ;)

replies(1): >>41893935 #
42. izacus ◴[] No.41893851{4}[source]
The ones that are identified as making decisions leading to death, yes.

It's completely normal in other fields where engineers build systems that can kill.

replies(2): >>41894849 #>>41901038 #
43. sokoloff ◴[] No.41893905{5}[source]
Imagine that in 2031, FSD cars could exactly halve all aspects of auto crashes (minor, major, single car, multi car, vs pedestrian, fatal/non, etc.)

Would you want FSD software to be developed or not? If you do, do you think holding devs or companies criminally liable for half of all crashes is the best way to ensure that progress happens?

replies(2): >>41894272 #>>41895047 #
44. latexr ◴[] No.41893935{8}[source]
No, the beginning of the thread is earlier. And with that context it seems clear to me that the “you” in the post you linked means “the company”, not “the individual software developer”. No one else in your replies seems confused by that, we all understand self-driving software wasn’t written by a single person that has ultimate decision power within a company.
replies(1): >>41894186 #
45. moooo99 ◴[] No.41894020{3}[source]
I think it is a matter of perceived risk.

Realistically speaking, nuclear power is pretty safe. In the history of nuclear power, there were two major incidents. Considering the number of nuclear power plants around the planet, that is pretty good. However, as those two accidents demonstrated, the potential fallout of those incidents is pretty severe and widespread. I think this massively contributes to the perceived risks. The warnings towards the public were pretty clear. I remember my mom telling stories from the time the Chernobyl incident became known to the public and people became worried about the produce they usually had from their gardens. Meanwhile, everything that has been done to address the hazards of fossil based power generation is pretty much happening behind the scenes.

With coal and natural gas, it seems like people perceive the risks as more abstract. The radioactive emissions of coal power plants have been known for a while, and the (potential) dangers of fine particulate matter resulting from combustion are somewhat well known nowadays as well. However, the effects of those dangers seem much more abstract and delayed, leading people to not be as worried about them. It also shows on a smaller, more individual scale: people still buy ICE cars at large and install gas stoves into their houses despite induction being readily available and at times even cheaper.

replies(2): >>41894445 #>>41894935 #
46. krisoft ◴[] No.41894185{5}[source]
I think the example here is that the designer draws a bridge for a railway model, and someone decides to use the same design and sends real locomotives across it. Is the original designer (who neither intended nor could have foreseen this) liable in your understanding?
replies(3): >>41894354 #>>41894366 #>>41894816 #
47. viraptor ◴[] No.41894186{9}[source]
If the message said "you release software", or "approve" or "produce", or something like that, sure. But it said "you write software" - and I don't think that can apply to a company, because writing is what individuals do. But yeah, maybe that's not what the author meant.
replies(1): >>41894422 #
48. notahacker ◴[] No.41894194{6}[source]
Or to put it even more straightforwardly: people who choose to drive rarely expect to drive more than a few tens of thousands of miles per year. People who choose to write an autonomous vehicle's lines of code potentially drive a billion miles per year, experiencing a lot more edge cases they are expected to handle in a non-dangerous manner, and they have to handle them via advance planning and interactions with a lot of other people's code.

The only practical way around this which permits autonomous vehicles (which are apparently dependent on much more complex and intractable codebases than, say, avionics) is a much higher threshold of criminal responsibility than the "the serious consequences resulted from the one-off execution of a dangerous manoeuvre which couldn't be justified in context" standard which sends human drivers to jail. And of course that double standard will be problematic if "willingness to accept liability" is the only safety threshold.

49. mrjin ◴[] No.41894269[source]
Even if it does, can it resurrect the deceased?
replies(1): >>41894616 #
50. blackoil ◴[] No.41894272{6}[source]
Say cars have near-zero casualties in the northern hemisphere but occasionally fail for cars driving topsy-turvy in the south. If the company knew about it and chose to ignore it because of profits, then yes, they should be charged criminally.
51. ndsipa_pomu ◴[] No.41894342[source]
> As soon as it’s good enough for Tesla to accept liability for accidents.

That makes a lot of sense and not just from a selfish point of view. When a person drives a vehicle, then the person is held responsible for how the vehicle behaves on the roads, so it's logical that when a machine drives a vehicle that the machine's manufacturer/designer is held responsible.

It's a complete con that Tesla is promoting their autonomous driving, but also having their vehicles suddenly switch to non-autonomous driving which they claim moves the responsibility to the human in the driver seat. Presumably, the idea is that the human should have been watching and approving everything that the vehicle has done up to that point.

replies(2): >>41894666 #>>41894794 #
52. ndsipa_pomu ◴[] No.41894354{6}[source]
That's a ridiculous argument.

If a construction firm takes an arbitrary design and then tries to build it in a totally different environment and for a different purpose, then the construction firm is liable, not the original designer. It'd be like Boeing taking a child's paper aeroplane design and making a passenger jet out of it and then blaming the child when it inevitably fails.

replies(3): >>41894574 #>>41894653 #>>41895101 #
53. kergonath ◴[] No.41894366{6}[source]
Someone, at some point signed off on this being released. Not thinking things through seriously is not an excuse to sell defective cars.
54. ndsipa_pomu ◴[] No.41894378{5}[source]
Not trivial, but that is exactly the kind of thing that successful insurance companies factor into their premiums, or specifically exclude those scenarios (e.g. not covering war zones for house insurance).
55. ndsipa_pomu ◴[] No.41894412[source]
That's not a workable idea as it'd just encourage corporations to obfuscate the ownership of the plant (e.g. shell companies) and drastically underestimate the actual risks of catastrophes. Ultimately, the government will be left holding the bill for nuclear catastrophes, so it's better to just recognise that and get the government to regulate the energy companies.
replies(1): >>41894859 #
56. latexr ◴[] No.41894422{10}[source]
> and I don't think that can apply to a company, because writing is what individuals do.

By that token, no action could ever apply to a company—including approving, producing, or releasing—since it is a legal entity, a concept, not a physical thing. For all those actions there was a person actually doing it in the name of the company.

It’s perfectly normal to say, for example, “GenericCorp wrote a press-release about their new product”.

57. pyrale ◴[] No.41894445{4}[source]
> However, the effects of those danger seem much more abstract and delayed, leading people to not be as worried about it.

Climate change is very visible in the present day to me. People are protesting about it frequently enough that it's hard to claim they are not worried.

replies(1): >>41895351 #
58. ◴[] No.41894574{7}[source]
59. mensetmanusman ◴[] No.41894610{3}[source]
Software requires hardware that can bit flip with gamma rays.
replies(3): >>41894643 #>>41894885 #>>41894887 #
60. LadyCailin ◴[] No.41894616[source]
But people driving manually kill people all the time too. The bar for self driving isn’t «does it never kill anyone», it’s «does it kill people less than manual driving». We’re not there yet, and Tesla’s «FSD» is marketing bullshit, but we certainly will be there one day, and at that point, we need to understand what we as a society will do when a self driving car kills someone. It’s not obvious what the best solution is there, and we need to continue to have societal discussions to hash that out, but the correct solution definitely isn’t «don’t use self driving».
replies(2): >>41894637 #>>41895419 #
61. amelius ◴[] No.41894637{3}[source]
No, because every driver thinks they are better than average.

So nobody will accept it.

replies(3): >>41894755 #>>41894992 #>>41901080 #
62. wongarsu ◴[] No.41894653{7}[source]
Or alternatively, if Boeing uses wood screws to attach an airplane door and the screws fail, that's on Boeing, not the airline, pilot or screw manufacturer. But if it's sold as an aerospace-grade attachment bolt with attachments for safety wire and a spec sheet that suggests the required loads are within design parameters, then it's the bolt manufacturer's fault when it fails, and they might have to answer for any deaths resulting from that. Unless Boeing knew or should have known that the bolts weren't actually as good as claimed, in which case the buck passes back to them.

Of course that's wildly oversimplifying and multiple entities can be at fault at once. My point is that these are normal things considered in regular engineering and manufacturing

63. andrewaylett ◴[] No.41894666[source]
The responsibility doesn't shift, it always lies with the human. One problem is that humans are notoriously poor at maintaining attention when supervising automation

Until the car is ready to take over as legal driver, it's foolish to set the human driver up for failure in the way that Tesla (and the humans driving Tesla cars) do.

replies(2): >>41894801 #>>41896371 #
64. renegade-otter ◴[] No.41894710[source]
In the United States? Come on. Boeing executives are not in jail - they are getting bonuses.
replies(1): >>41894852 #
65. the8472 ◴[] No.41894755{4}[source]
I expect insurance to figure out the relative risks and put a price sticker on that decision.
66. jefftk ◴[] No.41894760[source]
Note that Mercedes does take liability for accidents with their (very limited level) level 3 system: https://www.theverge.com/2023/9/27/23892154/mercedes-benz-dr...
replies(2): >>41894805 #>>41899207 #
67. kalenx ◴[] No.41894784{5}[source]
It is trivial and they've done it for ages. It's called reinsurance.

Basically (_very_ basically, there's more to it) the insurance company insures itself against large claims.

replies(1): >>41895090 #
68. f1shy ◴[] No.41894794[source]
>> When a person drives a vehicle, then the person is held responsible for how the vehicle behaves on the roads, so it's logical that when a machine drives a vehicle that the machine's manufacturer/designer is held responsible.

Never really understood the supposed dilemma. What happens when the brakes fail because of bad quality?

replies(2): >>41894839 #>>41895747 #
69. f1shy ◴[] No.41894801{3}[source]
What?! So if there is a failure and the car goes full throttle (no autonomous car) it is my responsibility?! You are pretty wrong!!!
replies(3): >>41895481 #>>41895527 #>>41906386 #
70. f1shy ◴[] No.41894805[source]
Yes. That is the only way. That being said, I want to see the first incidents, and how are they resolved.
71. ekianjo ◴[] No.41894812{3}[source]
How is that working with Boeing?
replies(1): >>41895001 #
72. f1shy ◴[] No.41894816{6}[source]
Are you serious?! You must be trolling!
replies(1): >>41895151 #
73. ekianjo ◴[] No.41894827{6}[source]
And yet tens of thousands of people die on the roads right now every year. Working well?
74. arzig ◴[] No.41894839{3}[source]
Then this would be manufacturing liability because they are not fit for purpose.
75. f1shy ◴[] No.41894847{7}[source]
It depends. If you do bad SW and skip reviews and processes, you may be liable. Even if you are told to do something, if you know it is wrong, you should say so. Right now I'm in the middle of s*t because I spoke up.
replies(1): >>41896160 #
76. A4ET8a8uTh0 ◴[] No.41894849{5}[source]
Pretty much. Fuck. I just watched higher ups sign off on a project I know for a fact has defects all over the place going into production despite our very explicit: don't do it ( not quite Tesla level consequences, but still resulting in real issues for real people ). The sooner we can start having people in jail for knowingly approving half-baked software, the sooner it will improve.
replies(1): >>41895257 #
77. f1shy ◴[] No.41894852{3}[source]
But some little guy down the line will pay for it. Look up the Eschede ICE accident.
replies(1): >>41894991 #
78. f1shy ◴[] No.41894859{3}[source]
The problem I see there is that if “corporations are responsible” then no one is. That is, no real person has the responsibility, and acts accordingly.
79. ekianjo ◴[] No.41894860{5}[source]
> make that decision after being informed about most of the known risks

Like for the COVID-19 vaccines? Experimental yet given to billions without ever showing them a consent form.

replies(1): >>41895076 #
80. aaronmdjones ◴[] No.41894885{4}[source]
Which is why hardware used to run safety-critical software is made redundant.

Take the Boeing 777 Primary Flight Computer for example. This is a fully digital fly-by-wire aircraft. There are 3 separate racks of equipment housing identical flight computers; 2 in the avionics bay underneath the flight deck, 1 in the aft cargo section. Each flight computer has 3 separate processors, supporting 2 dissimilar instruction set architectures, running the same software built by 3 separate compilers. Each flight computer catches instances of the software disagreeing about an action to be undertaken and resolves them by majority vote. The processor that makes these decisions is different in each flight computer.

The power systems that provide each flight computer are also fully redundant; each computer gets power from a power supply assembly, which receives 2 power feeds from 3 separate power supplies; no 2 power supply assemblies share the same 2 sources of power. 2 of the 3 power systems (L engine generator, R engine generator, and the hot battery bus) would have to fail and the APU would have to be unavailable in order to knock out 1 of the 3 computers.

This system has never failed in 30 years of service. There's still a primary flight computer disconnect switch on the overhead panel in the cockpit, taking the software out of the loop, to logically connect all of your control inputs to the flight surface actuators. I'm not aware of it ever being used (edit: in a commercial flight).
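
For illustration, a toy sketch of the 2-out-of-3 voting idea described above (my own simplification, not Boeing's actual implementation):

    # Toy sketch of 2-out-of-3 voting: three redundant lanes compute the same
    # actuator command; a value is used only if at least two lanes agree.
    def vote_2oo3(a, b, c, tol=0.05):
        for x, y in ((a, b), (a, c), (b, c)):
            if abs(x - y) <= tol:
                return (x + y) / 2.0, None       # a majority agrees; the faulty lane is outvoted
        return None, "no two lanes agree"        # total disagreement -> declare a fault

    cmd, fault = vote_2oo3(4.98, 5.01, 17.3)     # third lane producing garbage
    print(cmd, fault)                            # ~5.0 None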

replies(1): >>41895814 #
81. chgs ◴[] No.41894887{4}[source]
You can control for that. Multiple machines doing rival calculations, for example.
82. chgs ◴[] No.41894907{4}[source]
Need far more regulation of the software industry, far too many people working in it fail to understand the scope of what they do.

Civil engineer kills someone with a bad building, jail. Surgeon removes the wrong lung, jail. Computer programmer kills someone, “oh well it’s your own fault”.

replies(2): >>41895200 #>>41903272 #
83. chgs ◴[] No.41894921{6}[source]
You have one person in that RACI accountable box. That’s the engineer signing it off as fit. They are held accountable, including with jail if required.
84. brightball ◴[] No.41894935{4}[source]
During power outages, having natural gas in your home is a huge benefit. Many in my area just experienced it with Helene.

You can still cook. You can still get hot water. If you have gas logs you still have a heat source in the winter too.

These trade offs are far more important to a lot of people.

replies(1): >>41895342 #
85. renegade-otter ◴[] No.41894991{4}[source]
There are many examples.

The Koch brothers, famous "anti-regulatory state" warriors, have fought oversight so hard that their gas pipelines were allowed to be barely intact.

Two teens get into a truck, turn the ignition key - and the air explodes:

https://www.southcoasttoday.com/story/news/nation-world/1996...

Does anyone go to jail? F*K NO.

replies(1): >>41895304 #
86. A4ET8a8uTh0 ◴[] No.41894992{4}[source]
Assuming I understand the argument flow correctly, I think I disagree. If there is one thing that the past few decades have confirmed quite conclusively, it is that people will trade a lot of control and sense away in the name of convenience. The moment FSD reaches that sweet spot of 'take me home -- I am too drunk to drive' reliability, I think it would be accepted; maybe even required by law. It does not seem to be there yet.
87. mlinhares ◴[] No.41895001{4}[source]
People often forget corporations don’t go to jail. Murder when you’re not a person ends up with a slap.
88. ywvcbk ◴[] No.41895047{6}[source]
From a utilitarian perspective sure, you might be right but how do you exempt those companies from civil liability and make it impossible for victims/their families to sue the manufacturer? Might be legally tricky (driver/owner can explicitly/implicitly agree with the EULA or other agreements, imposing that on third parties wouldn’t be right).
replies(1): >>41895366 #
89. ywvcbk ◴[] No.41895076{6}[source]
Yes, but worse. Nobody physically forced anyone to get vaccinated so you still had some choice. Of course legally banning individuals from using public roads or sidewalks unless they give up their right to sue Tesla/etc. might be an option.
90. ywvcbk ◴[] No.41895090{6}[source]
I’m not sure Boeing etc. could have insured any liability risk resulting from engineering/design flaws in their vehicles?
91. bossyTeacher ◴[] No.41895100{3}[source]
Doesn't seem to happen in the medical and airplane industries, otherwise, Boeing would most likely not exist as a company anymore.
replies(1): >>41895177 #
92. krisoft ◴[] No.41895101{7}[source]
> That's a ridiculous argument.

Not making an argument. Asking a clarifying question about someone else’s.

> It'd be like Boeing taking a child's paper aeroplane design and making a passenger jet out of it and then blaming the child when it inevitably fails.

Yes exactly. You are using the same example I used to say the same thing. So which part of my message was ridiculous?

replies(1): >>41895440 #
93. krisoft ◴[] No.41895151{7}[source]
I assure you I am not trolling. You appear to have misread my message.

Take a deep breath. Read my message one more time carefully. Notice the question mark at the end of the last sentence. Think about it. If after that you still think I’m trolling you or anyone else I will be here and happy to respond to your further questions.

94. jsvlrtmred ◴[] No.41895177{4}[source]
Perhaps one can debate whether it happens often enough or severely enough, but it certainly happens. For example, and only the first one to come to mind - the president of PIP went to jail.
95. caddemon ◴[] No.41895200{5}[source]
I've never heard of a surgeon going to jail over a genuine mistake even if it did kill someone. I'm also not sure what that would accomplish - take away their license to practice medicine sure, but they're not a threat to society more broadly.
96. IX-103 ◴[] No.41895257{6}[source]
Should we require Professional Engineers to sign off on such projects the same way they are required to for other safety critical infrastructure (like bridges and dams)? The Professional Engineer that signed off is liable for defects in the design. (Though, of course, if the design is not followed then liability can shift back to the company that built it)
replies(1): >>41898367 #
97. IX-103 ◴[] No.41895304{5}[source]
To be fair, the teens knew about the gas leak and started the truck in an attempt to get away. Gas leaks like that shouldn't happen easily, but people near pipelines like that should also be made aware of the risks of gas leaks, as some leaks are inevitable.
replies(1): >>41897806 #
98. moooo99 ◴[] No.41895342{5}[source]
Granted, that is a valid concern if power outages are more frequent in your area. I have never experienced a power outage personally, so that is nothing I ever thought of. However, I feel like with solar power and battery storage systems becoming increasingly widespread, this won't be a major concern for much longer
replies(1): >>41898444 #
99. moooo99 ◴[] No.41895351{5}[source]
Climate change is certainly visible, although the extent to which areas are affected varies wildly. However, there are still shockingly many people who have a hard time attributing ever-increasing natural disasters and more extreme weather patterns to climate change.
100. Majromax ◴[] No.41895366{7}[source]
> how do you exempt those companies from civil liability and make it impossible for victims/their families to sue the manufacturer?

I don't think anyone in this thread has talked about an exemption from civil liability (sue for money), just criminal liability (go to jail).

Civil liability is the far less controversial issue because it's transferred all the time: governments even mandate that drivers carry insurance for this purpose.

With civil liability transfer, imperfect FSD can still make economic sense. Just as an insurance company needs to collect enough premium to pay claims, the FSD manufacturer would need to reserve enough revenue to pay its expected claims. In this case, FSD doesn't even need to be better than humans to make economic sense, in the same way that bad drivers can still buy (expensive) insurance.
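
A minimal sketch of that reserve arithmetic (all numbers invented purely for illustration, not real crash or claim statistics):

    # Illustrative only: made-up frequency and severity figures.
    miles_per_car_year  = 12_000          # assumed annual mileage per car
    at_fault_crash_rate = 1 / 500_000     # assumed at-fault crashes per mile under FSD
    avg_claim_cost      = 50_000          # assumed average payout per at-fault crash, $

    reserve_per_car_year = miles_per_car_year * at_fault_crash_rate * avg_claim_cost
    print(f"Expected claims to reserve: ${reserve_per_car_year:.0f} per car per year")
    # = $1200/year here; like an insurer's premium, the manufacturer just has
    # to price this expected cost in, so FSD need not beat a human driver for
    # the liability transfer to be economically workable.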

replies(2): >>41895467 #>>41895767 #
101. Majromax ◴[] No.41895419{3}[source]
> The bar for self driving isn’t «does it never kill anyone», it’s «does it kill people less than manual driving».

Socially, that's not quite the standard. As a society, we're at ease with auto fatalities because there's often Someone To Blame. "Alcohol was involved in the incident," a report might say, and we're more comfortable even though nobody's been brought back to life. Alternatively, "he was asking for it, walking at night in dark clothing, nobody could have seen him."

This is an emotional standard that speaks to us as human, story-telling creatures that look for order in the universe, but this is not a proper actuarial standard. We might need FSD to be manifestly safer than even the best human drivers before we're comfortable with its universal use.

replies(1): >>41901703 #
102. ndsipa_pomu ◴[] No.41895440{8}[source]
If it's not an argument, then you're just misrepresenting your parent poster's comment by introducing a scenario that never happens.

If you didn't intend your comment as a criticism, then you phrased it poorly. Do you actually believe that your scenario happens in reality?

replies(2): >>41895781 #>>41897990 #
103. everforward ◴[] No.41895466{7}[source]
You could be held liable if it impacts someone else. A restaurant serving improperly cooked chicken that gives people E Coli is liable. Private citizens may not have that duty, I’m not sure.

You would likely also be liable if you overloaded an electrical cable, causing a fire that killed someone.

“Using it in the way it was intended” is largely circular reasoning; of course it wasn’t intended to hurt anyone, so any usage that does hurt someone was clearly unintended. People frequently harm each other by misusing items in ways they didn’t realize were misuses.

> This is untenable. Does nobody want a formally verified avionics system in their airliner, either?

Not for the price it would cost. Airbus is the pioneer here, and even they apply formal verification sparingly. Here’s a paper from a few years ago about it, and how it’s untenable to formally verify the whole thing: https://www.di.ens.fr/~delmas/papers/fm09.pdf

Software development effort generally tends to scale superlinearly with complexity. I am not an expert, but the impression I get is that formal verification grows exponentially with complexity to the point that it is untenable for most things beyond research and fairly simple problems. It is a huge pain in the ass to do something like putting time bounds around reading a config file.

IO also sucks in formal verification from what I hear, and that’s like 80% of what a plane does. Read these 300 signals, do some standard math, output new signals to controls.

These things are much easier to do with tests, but tests only check for scenarios you’ve thought of already

replies(1): >>41898516 #
104. ywvcbk ◴[] No.41895467{8}[source]
> just criminal liability (go to jail).

That just seems like a theoretical possibility (even if that). I don’t see how any engineer or even someone in management could go to jail unless intent or gross negligence can be proven.

> drivers carry insurance for this purpose.

The mandatory limit is extremely low in many US states.

> expected claims

That seems like the problem. It might take a while until we reach an equilibrium of some sort.

> that bad drivers can still buy

That's still capped by the amount of coverage + total assets held by that bad driver. In Tesla's case there is no real limit (without legislation/established precedent). Juries/courts would likely be influenced by that fact as well.

105. kgermino ◴[] No.41895481{4}[source]
You are responsible (Legally, contractually, morally) for supervising FSD today. If the car decided to stomp on the throttle you are expected to be ready to hit the brakes.

The whole point is that this is somewhat of an unreasonable expectation, but it's what Tesla expects you to do today.

replies(2): >>41896164 #>>41896283 #
106. xondono ◴[] No.41895527{4}[source]
Autopilot, FSD, etc.. are all legally classified as ADAS, so it’s different from e.g. your car not responding to controls.

The liability lies with the driver, and all Tesla needs to prove is that input from the driver will override any decision made by the ADAS.

107. renewiltord ◴[] No.41895652{3}[source]
Sure, we can have a carbon tax on everything. That's fine. And then the nuclear plant has to pay for a Pripyat-sized exclusion zone around it. Just like the guy said about Tesla. All fair.
108. hibikir ◴[] No.41895710{3}[source]
Remember that this is neural networks doing the driving, more than old expert systems: What makes a crash happen is a network that fails to read an image correctly, or a network that fails to capture what is going on when melding input from different sensors.

So the blame won't be on a guy who got an if statement backwards, but on someone signing off on stopping training, failing to have certain kinds of pictures in the training set, or some other similar, higher-order problem. Blame will be incredibly nebulous.

replies(1): >>41902224 #
109. ndsipa_pomu ◴[] No.41895747{3}[source]
> What happens when the brakes fail because of bad quality?

Depends on the root cause of the failure. Manufacturing faults would put the liability on the manufacturer; installation mistakes would put the liability on the mechanic; using them past their useful life would put the liability on the owner for not maintaining them in working order.

110. DennisP ◴[] No.41895767{8}[source]
In fact, if you buy your insurance from Tesla, you effectively do put civil responsibility for FSD back in their hands.
111. lcnPylGDnU4H9OF ◴[] No.41895781{9}[source]
It was not a misrepresentation of anything. They were just restating the worry that was stated in the GP comment. https://news.ycombinator.com/item?id=41892572

And the only reason the commenter I linked to had that response is because its parent comment was slightly careless in its phrasing. Probably just change “write” to “deploy” to capture the intended meaning.

112. mensetmanusman ◴[] No.41895814{5}[source]
You can’t guarantee the hardware was properly built.
replies(1): >>41895873 #
113. sigh_again ◴[] No.41895839{4}[source]
>Software I write shouldn't be relied on in critical situations.

Then don't write software to be used in things that are literally always critical situations, like cars.

114. aaronmdjones ◴[] No.41895873{6}[source]
Unless Intel, Motorola, and AMD all conspire to give you a faulty processor, you will get a working primary flight computer.

Besides, this is what flight testing is for. Aviation certification authorities don't let an aircraft serve passengers unless you can demonstrate that all of its safety-critical systems work properly and that it performs as described.

I find it hard to believe that automotive works much differently in this regard, which is what things like crumple zone crash tests are for.

115. Filligree ◴[] No.41896160{8}[source]
> Right now I’m in middle of s*t because of I spoked up.

And you believe that, despite experiencing what happens if you speak up?

We shouldn’t simultaneously require people to take heroic responsibility, while also leaving them high and dry if they do.

replies(1): >>41896521 #
116. f1shy ◴[] No.41896164{5}[source]
My example was clearly NOT about autonomous driving, because the previous comment seems to imply you are responsible for everything.
117. theptip ◴[] No.41896173[source]
Presumably that is exactly when their taxi service rolls out?

While this has a dramatic rhetorical flourish, I don’t think it’s a good proxy. Even if it was safer, it would be an unnecessarily high burden to clear. You’d be effectively writing a free insurance policy which is obviously not free.

Just look at total accidents/deaths per mile driven; it's the obvious and standard metric for measuring car safety. (You need to be careful not to stop the clock as soon as the system disengages, of course.)

118. FireBeyond ◴[] No.41896283{5}[source]
> If the car decided to stomp on the throttle you are expected to be ready to hit the brakes.

Didn't Tesla have an issue a couple of years ago where pressing the brake did not disengage any throttle? i.e. if the car has a bug and puts throttle to 100% and you stand on the brake, the car should say "cut throttle to 0", but instead, you just had 100% throttle, 100% brake?

replies(1): >>41897359 #
119. mannykannot ◴[] No.41896371{3}[source]
> The responsibility doesn't shift, it always lies with the human.

Indeed, and that goes for the person or persons who say that the products they sell are safe when used in a certain way.

120. f1shy ◴[] No.41896521{9}[source]
I do believe I am responsible. I recognize I am now in a position where I can speak without fear. If I get fired I would throw a party, tbh.
121. sashank_1509 ◴[] No.41896899{3}[source]
Do we send Boeing engineers to jail when their plane crashes?

Intention matters when passing criminal judgement. If a mother causes the death of her baby due to some poor decision (say, feeding her something contaminated), no one proposes or tries to jail the mother, because they know the intention was the opposite.

replies(1): >>41901773 #
122. lowbloodsugar ◴[] No.41896926[source]
And corporations are people now, so Tesla can go to jail.
123. 7sidedmarble ◴[] No.41897174{6}[source]
I don't think anyone's seriously suggesting people be held accountable for bugs which are ultimately accidents. But if you knowingly sign off on, oversee, or are otherwise directly responsible for the construction of software that you know has a good chance of killing people, then yes, there should be consequences for that.
replies(1): >>41903564 #
124. blackeyeblitzar ◴[] No.41897359{6}[source]
If it did, it wouldn’t matter. Brakes are required to be stronger than engines.
replies(1): >>41897796 #
125. FireBeyond ◴[] No.41897796{7}[source]
That makes no sense. Yes, they are. But brakes are going to be more reactive and performant with the throttle at 0 than 100.

You can't imagine that the stopping distances will be the same.

126. 8note ◴[] No.41897806{6}[source]
As an alternative though, the company also failed at handling the gas leak once it started. They could have had people all over the place guiding people out and away from the leak safely, and keeping the public away while the leak is fixed.

Or, they could buy sufficient buffer land around the pipeline such that the gas leak will be found and stopped before it could explode down the road

127. krisoft ◴[] No.41897990{9}[source]
> you're just misrepresenting your parent poster's comment

I did not represent or misrepresent anything. I have asked a question to better understand their thinking.

> If you didn't intend your comment as a criticism, then you phrased it poorly.

Quite probably. I will have to meditate on it.

> Do you actually believe that your scenario happens in reality?

With railway bridges? Never. It would ring alarm bells for everyone from the fabricators to the locomotive engineer.

With software? All the time. Someone publishes some open source code, someone else at a corporation bolts the open source code into some application, and now the former "toy train bridge" is a load-bearing key component of something the original developer could never imagine nor plan for.

This is not theoretical. Very often I’m the one doing the bolting.

And to be clear: my opinion is that the liability should fall with whoever integrated the code and certified it to be fit for some safety critical purpose. As an example if you publish leftpad and i put it into a train brake controller it is my job to make sure it is doing the right thing. If the train crashes you as the author of leftpad bear no responsibility but me as the manufacturer of discount train brakes do.

128. A4ET8a8uTh0 ◴[] No.41898367{7}[source]
I hesitate, because I shudder at the government deciding which algorithm is best for a given scenario (because that is effectively where it would go). Maybe the distinction is the moment money changes hands based on a product?

I am not an engineer, but I have watched clearly bad decisions take place from a technical perspective, where a person with a title that went to their head and a bonus not aligned with the right incentives messed things up for us. Maybe some professionalization of software engineering is in order.

replies(1): >>41902255 #
129. brightball ◴[] No.41898444{6}[source]
They aren’t frequent but in the last 15-16 years there have been 2 outages that lasted almost 2 weeks in some areas around here. The first one was in the winter and the only gas appliance I had was a set of gas logs in the den.

It heated my whole house and we used a pan to cook over it. When we moved the first thing I did was install gas logs, gas stove and a gas water heater.

It’s nice to have options and backup plans. That’s one of the reasons I was a huge fan of the Chevy Volt when it first came out. I could easily take it on a long trip but still averaged 130mpg over 3 years (twice). Now I’ve got a Tesla and when there are fuel shortages it’s also really nice.

A friend of ours owns a cybertruck and was without power for 9 days, but just powered the whole house with the cybertruck. Every couple of days he’d drive to a supercharger station to recharge.

130. kergonath ◴[] No.41898516{8}[source]
> You could be held liable if it impacts someone else. A restaurant serving improperly cooked chicken that gives people E Coli is liable. Private citizens may not have that duty, I'm not sure.

> You would likely also be liable if you overloaded an electrical cable, causing a fire that killed someone.

Right. But neither of these examples are following guidelines or proper use. If I turn the car into people on the pavement, I am responsible. If the steering wheel breaks and the car does it, then the manufacturer is responsible (or the mechanic, if the steering wheel was changed). The question at hand is whose responsibility it is if the car’s software does it.

> “Using it in the way it was intended” is largely circular reasoning; of course it wasn’t intended to hurt anyone, so any usage that does hurt someone was clearly unintended.

This is puzzling. You seem to be conflating use and consequences and I am not quite sure how you read that in what I wrote. Using a device normally should not make it kill people, I guess at least we can agree on that. Therefore, if a device kills people, then it is either improper use (and the fault of the user), or a defective device, at which point it is the fault of the designer or manufacturer (or whoever did the maintenance, as the case might be, but that’s irrelevant in this case).

Each device has a manual and a bunch of regulations about its expected behaviour and standard operating procedures. There is nothing circular about it.

> Not for the price it would cost.

Ok, if you want to go full pedantic, note that I wrote “want”, not “expect”.

131. iknowstuff ◴[] No.41899207[source]
It's pathetic: <40 mph, following a vehicle directly ahead. Basically only usable in stop-and-go traffic.

https://www.notebookcheck.net/Tesla-vs-Mercedes-self-driving...

replies(1): >>41899255 #
132. jefftk ◴[] No.41899255{3}[source]
The Mercedes system is definitely, as I said, very limited. But within its operating conditions the Mercedes system is much more useful: you can safely and legally read, work, or watch a movie while in the driver's seat, literally not paying any attention to the road.
133. Dylan16807 ◴[] No.41901038{5}[source]
That's liability for defective design, not any time it fails as suggested above.
134. Dylan16807 ◴[] No.41901080{4}[source]
The level where someone personally uses it and the level where they accept it being on the road are different. Beating the average driver is all about the latter.

Also I will happily use self driving that matches the median driver in safety.

135. LadyCailin ◴[] No.41901703{4}[source]
That may be true, but I think I personally would find it extremely hard to argue against when the numbers clearly show that it's safer. I think once the numbers unambiguously show that autopilots are safer, it will be super hard for people to argue against it. Of course there is a huge intermediate state where the numbers aren't clear (or at least not clear to the average person), and during that stage, emotions may rule the debate. But if the underlying data is there, I'm certain car companies can change the narrative - just look at how much America hates public transit and jaywalkers.
136. davkan ◴[] No.41901773{4}[source]
This is why we have criminal negligence. Did the mother open a sealed package from the grocery store or did she find an open one on the ground?

Harder to apply to software, but maybe there should be some legal liability involved when a sysadmin uses admin/admin and health information is leaked.

Some Boeing employees should absolutely be in jail over the MCAS system and the hundreds of people who died as a result. But the actions there go beyond negligence anyway.

137. snovv_crash ◴[] No.41902224{4}[source]
This is the difference between a Professional Engineer (i.e. the protected term) and everyone else who calls themselves an engineer. A PE can put their signature on a system, and that signature makes them criminally liable if it fails.

Bridges, elevators, buildings, ski lifts, etc. all require a professional engineer to sign off on them before they can be built. Maybe self-driving cars need the same treatment.

138. snovv_crash ◴[] No.41902255{8}[source]
This isn't a matter of the government saying what you need to do. This is a matter of being held criminally liable if people get hurt.
replies(1): >>41903713 #
139. _rm ◴[] No.41903272{5}[source]
You made all that up out of nothing. They'd only go to jail if it was intentional.

The only case where a computer programmer "kills someone" is where he hacks into a system and interferes with it in a way that foreseeably leads to someone's death.

Otherwise, the user voluntarily assumed the risk.

Frankly, if someone lets a computer drive their car, given their own ample experience of computers "crashing", it's basically a form of attempted suicide.

140. thunky ◴[] No.41903564{7}[source]
Exactly. Just like most car accidents don't result in prison or death, but negligence or recklessness can get you there.
141. A4ET8a8uTh0 ◴[] No.41903713{9}[source]
You are only technically correct. And even then, in terms of civics, by holding people criminally liable the government is telling you what to do (or, technically, what not to do). Note that no other body can (legally) do that. In fact, false imprisonment is itself a punishable offense, but I digress...

Now, we could argue over whether that is/should/could/would be the law of the land, but have you considered how it would be enforced?

I mean, I can tell you firsthand what it looks like when the government hands an industry a vague law to figure out and an enforcement agency with a broad mandate.

That said, I may have exaggerated a little bit on the algo choice. I was shooting for ghoulish overkill.

replies(1): >>41905305 #
142. freejazz ◴[] No.41905305{10}[source]
> You are only technically correct

You clearly have no idea how civil liability works. At all.

replies(1): >>41905548 #
143. A4ET8a8uTh0 ◴[] No.41905548{11}[source]
I am here to learn. You can help me by educating me. I do mean it sincerely. If you think you have a grasp on the subject, I think HN as a whole could benefit from your expertise.
replies(1): >>41905564 #
144. freejazz ◴[] No.41905564{12}[source]
Civil liability isn't determined by the "gov't"; it's determined by a jury of your peers. More interesting to me is how you came to the impression that you had any idea what you were talking about, to the point where you felt justified in making your post.
replies(1): >>41905827 #
145. A4ET8a8uTh0 ◴[] No.41905827{13}[source]
My friend. Thank you. It is not often I get to be myself lately. Allow me to retort in kind.

Your original response to me was in turn a response to the following sentence by "snovv_crash":

"This isn't a matter of the government saying what you need to do. This is a matter of being held criminally liable if people get hurt."

I do want to point out that, from the beginning, the narrow scope of this argument defined the type of liability as criminal, not civil as your post suggested. In other words, your whole point kinda falls apart, since I was not talking about civil liability but about the connection between civics and the government's (or society's, depending on your philosophical bent) monopoly on violence.

It is possible that the word "civics" threw you off, but I was merely referring to the study of the rights, duties, and obligations of citizens in a society. Surely you would agree that writing code that kills people falls under the purview of the rights, duties, and obligations of individuals in a society?

In either case, I am not sure what you are arguing for here. It is not just that you are wrong; you seem to be oddly focused on trying to... I am not even sure what. Maybe I should ask you instead.

<<More interesting to me is how you came to the impression that you had any idea what you were talking about to the point you felt justified in making your post.

Yes, good question. Now that I have replied, I feel it would not be a bad idea (edit: for you) to explain why you feel (and I use that verb consciously) you can throw word salad around willy-nilly, not only with confidence, but with justification worthy of a justicar.

tldr: You are wrong, but can you even accept that you are wrong... now that will be an interesting thing to see.

<< that you had any idea

I am a guy on the internet, man. No one has any idea about anything. Cheer up :D

replies(1): >>41906548 #
146. andrewaylett ◴[] No.41906386{4}[source]
The point at which we decide that a defect is serious enough to transfer liability is quite case-dependent. If you knew that the throttle was glitchy but hadn't done anything to fix it, yes. If it affected every car from the manufacturer, it's obviously their fault -- but if you ignore the recall then it might be your fault again?

In this case, the behaviour of the system and the responsibility of the driver are well established. I'd actually quite like it if Tesla were held responsible for their software, but they somehow continue to skirt the line: they require the driver to retain vigilance, so any system failure is the (legal) fault of the human, not the car, despite the feature being advertised as "Full Self Driving".

replies(1): >>41906722 #
147. freejazz ◴[] No.41906548{14}[source]
In a criminal court, guilt (not liability) is also determined by a jury of your peers, and not the gov't.
148. dragonwriter ◴[] No.41906722{5}[source]
> The point at which we decide that a defect is serious enough to transfer liability is quite case-dependent. If you knew that the throttle was glitchy but hadn't done anything to fix it, yes. If it affected every car from the manufacturer, it's obviously their fault -- but if you ignore the recall then it might be your fault again?

In most American jurisdictions' liability law, the more usual thing is to expand liability rather than to transfer it. The idea that exactly one -- or at most one -- person or entity should be liable for any given portion of any given harm is a popular one in places like HN, but the law is much more accepting of the situation where many people have overlapping liability for the same harm, with none relieving the others.

The liability of a driver for maintenance and operation within the law is not categorically mutually exclusive with the liability of the manufacturer (and, indeed, every party in the chain of commerce) for manufacturing defects.

If a car is driven in a way that violates the rules of the road and causes an accident, and a manufacturing defect in a driver-assistance system contributed to that accident, it is quite possible for the driver, the manufacturer of the driver-assistance system, the manufacturer of the vehicle (if different from that of the assistance system), and the seller of the vehicle to the driver (if different from the last two), among others, all to be fully liable to those injured for the harms.