And as the remedy starts being applied (aka "liability"), the enthusiasm for AI will start to wane.
I wouldn't be surprised if some businesses ban the use of AI --- starting with law firms.
And as the remedy starts being applied (aka "liability"), the enthusiasm for software will start to wane.
What, if anything, do you think is wrong with my analogy? I doubt most people here support strict liability for bugs in code.
I think what is clearly wrong with your analogy is that it assumes AI applies mostly to software and code production. That is actually a minor use case for AI.
Governments and businesses of all types --- doctors, lawyers, airlines, delivery companies, etc. --- are attempting to apply AI to uses and situations that can't be tested in advance the way "vibe" code can. And some of the adverse results have already been ruled on in court.