https://investors.autodesk.com/news-releases/news-release-de...
Come up with a better comparison.
First, Authors argue that using works to train Claude’s underlying LLMs was like using works to train any person to read and write, so Authors should be able to exclude Anthropic from this use (Opp. 16).

Second, to that last point, Authors further argue that the training was intended to memorize their works’ creative elements — not just their works’ non-protectable ones (Opp. 17).

Third, Authors next argue that computers nonetheless should not be allowed to do what people do.
https://media.npr.org/assets/artslife/arts/2025/order.pdf

> First, Authors argue that using works to train Claude’s underlying LLMs was like using works to train any person to read and write, so Authors should be able to exclude Anthropic from this use (Opp. 16). But Authors cannot rightly exclude anyone from using their works for training or learning as such. Everyone reads texts, too, then writes new texts. They may need to pay for getting their hands on a text in the first instance. But to make anyone pay specifically for the use of a book each time they read it, each time they recall it from memory, each time they later draw upon it when writing new things in new ways would be unthinkable. For centuries, we have read and re-read books. We have admired, memorized, and internalized their sweeping themes, their substantive points, and their stylistic solutions to recurring writing problems.
Couldn't have put it better myself (though $deity knows I tried many times on HN). Glad to see Judge Alsup continues to be the voice of common sense in legal matters around technology.
Yep, that name's a blast from the past! He was the judge on the big Oracle v. Google case about Android and Java years ago, IIRC. I think he even learned to write some Java so he could better understand the case.