
989 points | acomjean
aeon_ai ◴[] No.45143392[source]
To be very clear on this point - this is not related to model training.

It’s important to understand that, in the fair use assessment, the training itself is fair use; the issue at hand is the pirating of the books, which is what Anthropic “whoopsied” into when acquiring the training data.

Buying used copies of books, scanning them, and training on them is fine.

Rainbows End was prescient in many ways.

replies(36): >>45143460 #>>45143461 #>>45143507 #>>45143513 #>>45143567 #>>45143731 #>>45143840 #>>45143861 #>>45144037 #>>45144244 #>>45144321 #>>45144837 #>>45144843 #>>45144845 #>>45144903 #>>45144951 #>>45145884 #>>45145907 #>>45146038 #>>45146135 #>>45146167 #>>45146218 #>>45146268 #>>45146425 #>>45146773 #>>45146935 #>>45147139 #>>45147257 #>>45147558 #>>45147682 #>>45148227 #>>45150324 #>>45150567 #>>45151562 #>>45151934 #>>45153210 #
rchaud ◴[] No.45144837[source]
> Buying used copies of books, scanning them, and training on them is fine.

But nobody was ever going to do that, not when there are billions in VC dollars at stake for whoever moves fastest. Everybody will simply risk the fine, which tends not to be anywhere close to large enough to have a deterrent effect.

That is like saying Uber would not have had any problems if they had just entered into licensing contracts with taxi medallion holders. It was faster to put unlicensed taxis on the streets and use investor money to pay fines and lobby for favorable legislation. In the same way, it was faster for Anthropic to load up their models with un-DRM'd PDFs and ePUBs from wherever instead of licensing them publisher by publisher.

replies(15): >>45144965 #>>45145196 #>>45145216 #>>45145270 #>>45145297 #>>45145300 #>>45145388 #>>45146392 #>>45146407 #>>45146846 #>>45147108 #>>45147461 #>>45148242 #>>45152291 #>>45152841 #
ReFruity ◴[] No.45144965[source]
> But nobody was ever going to do that

If the choice is between risking a $1.5 billion payout and safely paying $15 million up front, they might.

replies(2): >>45145247 #>>45145248 #
crote ◴[] No.45145247[source]
Option 1: $183B valuation, $1.5B settlement.

Option 2: near-$0 valuation, $15M purchasing cost.

To an investor, that just looks like a pretty good deal, I reckon. It's just the cost of doing business - which in my opinion is exactly what is wrong with practices like these.

replies(2): >>45145553 #>>45146344 #
fn-mote ◴[] No.45145553[source]
> which in my opinion is exactly what is wrong with practices like these.

What's actually wrong with this?

They paid $1.5B for a bunch of pirated books. Seems like a fair price to me, but what do I know.

The settlement should reflect society's assessment of the harm, or act as a deterrent; I'm not sure which (maybe both).

This might be controversial, but I think a free society needs to let people break the rules if they are willing to pay the cost. Imagine if you couldn't speed in a car. Imagine if you couldn't choose to be jailed for nonviolent protest.

This isn't some case where they destroyed a billion dollars worth of pristine wilderness and got off with a slap on the wrist.

replies(6): >>45145713 #>>45145807 #>>45145851 #>>45146427 #>>45147457 #>>45148231 #
zmmmmm ◴[] No.45145713[source]
> I think a free society needs to let people break the rules if they are willing to pay the cost

so you don't think super rich people should be bound by laws at all?

Unless you made the cost proportional to (maybe exponential in) somebody's wealth, you would be creating a completely lawless class who would wreak havoc on society.

replies(4): >>45145819 #>>45147306 #>>45147326 #>>45150397 #
LMYahooTFY ◴[] No.45147306[source]
The law was not broken by "super rich people".

It was broken by a company of people who were not very rich at all and have managed to produce billions in value (not dollars, value) by breaking said laws.

They're not trafficking humans or doing predatory lending, they're building AI.

This is why our judicial system literally handles things on a case by case basis.

replies(2): >>45147488 #>>45148253 #
kelnos ◴[] No.45148253{4}[source]
> It was broken by a company of people who were not very rich at all

I think the company's bank account would beg to differ on that.

> managed to produce billions in value (not dollars, value) by breaking said laws.

Ah, so breaking the law is ok if enough "value" is created? Whatever that means?

> They're not trafficking humans or doing predatory lending, they're building AI.

They're not trafficking humans or doing predatory lending, they're infringing on the copyright of book authors.

Not sure why you ended that sentence with "building AI", as that's not comparing apples to apples.

But sure, ok, so it's ok to break the law if you, random person on the internet, think their end goals are worthwhile? So the ends justify the means, huh?

> This is why our judicial system literally handles things on a case by case basis.

Yes, and Anthropic was afraid enough of an unfavorable verdict in this particular case that they paid a billion and a half to make it go away.