747 points porridgeraisin | 45 comments
1. aurareturn ◴[] No.45062782[source]
Just opened Claude app on Mac and saw a popup asking me if it's ok to train on my chats. It's on by default. Unchecked it.

I think Claude saw that OpenAI was reaping too much benefit from this, so they decided to do it too.

replies(5): >>45062800 #>>45062824 #>>45062865 #>>45063224 #>>45065138 #
2. demarq ◴[] No.45062800[source]
Also your chats will now be stored for 5 years.
replies(2): >>45062821 #>>45062948 #
3. aurareturn ◴[] No.45062821[source]
I used to not care about this stuff but with the way this administration is going about things, I suddenly care very much about it.
replies(2): >>45062871 #>>45062903 #
4. staticman2 ◴[] No.45062824[source]
Given how competitive Claude has been with ChatGPT models without training on user data, I'm curious how useful OpenAI could have found it.
5. echelon ◴[] No.45062865[source]
We should be able to train on foundation model outputs.

These bastard companies pirated the world's data, then they train on our personal data. But they have the gall to say we can't save their models' inputs and outputs and distill their models.

replies(2): >>45062880 #>>45063053 #
6. bayindirh ◴[] No.45062871{3}[source]
Trusting companies more than the government always feels strange. It's something I can't grasp.
replies(7): >>45062913 #>>45062918 #>>45062925 #>>45062928 #>>45062971 #>>45063033 #>>45063100 #
7. jacooper ◴[] No.45062880[source]
You can. They might not like it, but there's no legal basis saying you can't.
replies(1): >>45062919 #
8. demarq ◴[] No.45062903{3}[source]
It's more that five years' worth of people's most personal conversations is an absolute treasure trove, and it makes their systems much more inviting targets for hackers and, yes, governments.

The part that irks me is that this includes people who are literally paying for the service.

9. demarq ◴[] No.45062913{4}[source]
One has next to no consequences or oversight
10. aleph_minus_one ◴[] No.45062918{4}[source]
Why not distrust both?! :-)
replies(1): >>45062968 #
11. datadrivenangel ◴[] No.45062919{3}[source]
Violating terms and conditions can be enough to at least get you charged with computer fraud and abuse.
replies(1): >>45063048 #
12. elzbardico ◴[] No.45062925{4}[source]
Trusting any of them is a luxury afforded in a short period of history in rich countries.

That's why the usual ethos in places like HN of treating any doubt about government actions as lowbrow paranoid conspiracy theory stuff is so exasperating for those of us who came from either the former Soviet bloc or third-world nations.

replies(1): >>45063405 #
13. AlecSchueler ◴[] No.45062928{4}[source]
How many companies can disappear me to El Salvador?
replies(3): >>45063004 #>>45063089 #>>45063825 #
14. OtherShrezzing ◴[] No.45062948[source]
And there's no way to opt out of the training without agreeing to the 5-year retention. Anthropic has slipped so far, so fast, from its objective of being the ethical AI company.
replies(1): >>45063194 #
15. ◴[] No.45062968{5}[source]
16. slipperydippery ◴[] No.45062971{4}[source]
I don't get drawing a distinction. If a company has it, there's at least one government out there that either already has it too (some telecom companies just give them data portals, for example) or can get it any time they choose.

Corporate surveillance is government surveillance. Always has been.

17. giraffe_lady ◴[] No.45063004{5}[source]
Well relatedly I think several of the tech billionaires considered this question and decided the answer was "not enough."
18. twoquestions ◴[] No.45063033{4}[source]
I 90% agree with you, though Apple did stand up to the FBI some years ago. The US gov't at least is much more restricted in what data it can collect and act on, due to the 4th Amendment among other laws, and as another commenter said, Apple can't blackbag me to El Salvador.

Apple/FBI story in question: https://apnews.com/general-news-c8469b05ac1b4092b7690d36f340...

replies(1): >>45063060 #
19. echelon ◴[] No.45063048{4}[source]
That's disgusting.

We need a Galoob vs. Nintendo [1], Sony vs. Universal [2], or whatever that TiVo case was (I don't think it was TiVo vs. EchoStar). A case that establishes anyone can scrape and distill models.

[1] https://en.wikipedia.org/wiki/Lewis_Galoob_Toys,_Inc._v._Nin....

[2] https://en.wikipedia.org/wiki/Sony_Corp._of_America_v._Unive....

20. elzbardico ◴[] No.45063053[source]
I am pretty sure they try to do it all the time between themselves. Most of the real sauce in AI coding comes from reinforcement learning, usually done by armies of third-world outsourced developers tediously doing all kinds of tasks with instructions to detail their reasoning behind each change. Things like: "to run this python test in a docker container with the python image we need to install the python package xyz, but then, as it has some native code, we also need to install build-essential..."

While those developers are not well paid (usually around 30/40 USD hour, no benefits), you need a lot of them, so there is a big temptation to also create synthetic data sets from your more capable competitor.

Given the fact that AI companies have this Jihad zeal to achieve their goals no matter what (like, fuck copyright, fuck the environment, etc, etc), it would be naive to believe they don't at least try to do it.

And even if they don't do it directly, their outsourced developers will do it indirectly by using AI to help with their tasks.

replies(1): >>45063148 #
21. bayindirh ◴[] No.45063060{5}[source]
Apple is an exception, and even that is debatable because of the unencrypted backups they store.

On the other hand, what Apple did is a tangible thing with a tangible result.

This gives them better optics for now, but there is no law that says they can't change.

Their business model is being an "accessible luxury brand with the privacy guarantees of Switzerland, as far as the laws allow". So, one could also argue, they have to do this.

22. sillyfluke ◴[] No.45063089{5}[source]
"US Army appoints Palantir, Meta, OpenAI execs as Lt. Colonels" [0]

Well, probably easier than you think. Given that it looks like Palantir is able to control the software and hardware of the new-fangled detention centers with impunity, how difficult do you think it is for them to disappear someone without any accountability?

It is precisely the blurring of the line between government and private companies that aids in subverting the rule of law in many instances.

[0] https://thegrayzone.com/2025/06/18/palantir-execs-appointed-...

replies(2): >>45063330 #>>45077638 #
23. sokoloff ◴[] No.45063100{4}[source]
The government has the direct power to imprison me or seize my property if I cross them.

It seems strange to not be able to grasp the difference in kind here.

replies(2): >>45063136 #>>45063861 #
24. bayindirh ◴[] No.45063136{5}[source]
What happens if your Google account is locked because you shared your son's pictures with his M.D. as part of an ongoing treatment?

What happens when the same company locks all your book drafts because an algorithm deemed that you're plotting something against someone?

Both are real events, BTW.

replies(1): >>45063160 #
25. sokoloff ◴[] No.45063148{3}[source]
> those developers are not well paid (usually around 30/40 USD hour, no benefits)

$40/hour full time would put you just over the median household income for the US.

I suspect this provides quite a good living for their families, and the devs doing the work feel like they're well paid.

replies(1): >>45064403 #
26. sokoloff ◴[] No.45063160{6}[source]
I think I missed the part where Google imprisoned someone.

The government forces me to do business with them; if I don't pay them tens (and others hundreds) of thousands of dollars every year they will send people with guns to imprison me and eventually other people with guns to seize my property.

Me willingly giving Google some data and them capriciously deciding to not always give it back doesn't seem anything like the same to me. (It doesn't mean I like what Google's doing, but they have nowhere near the power of the group that legally owns and uses tanks.)

replies(1): >>45063213 #
27. smca ◴[] No.45063194{3}[source]
> If you do not choose to provide your data for model training, you’ll continue with our existing 30-day data retention period.

https://www.anthropic.com/news/updates-to-our-consumer-terms

28. bayindirh ◴[] No.45063213{7}[source]
Their lives effectively stopped, since they were locked out of everything, forever. Not forgetting that the first guy's son's pictures ended up in a CSAM database and he lost his account permanently; Google didn't give his account back [0].

A company "applied what the law said" and refused to admit that it made a mistake and overreached, which is behavior generally attributed to governments.

So, I think you missed the effects of this little binary flag on their lives.

[0]: https://www.theguardian.com/technology/2022/aug/22/google-cs...

replies(1): >>45063293 #
29. fusslo ◴[] No.45063224[source]
My work just signed an enterprise agreement with Anthropic. I just checked, and it says: "Your data will not be trained on or used to improve the product. Code is stored to personalize your experience. Applies to all team members."
30. sokoloff ◴[] No.45063293{8}[source]
> Their life effectively stopped since they are locked out of everything

What?! Google locked them out of Google. I'm sure they can still get search, email, and cloud services from many other providers.

The government can lock you away in a way that is far more impactful and much closer to "life stopped; locked out of everything" than "you can't have the data you gave us back".

replies(1): >>45063591 #
31. AlecSchueler ◴[] No.45063330{6}[source]
Oh I have no doubt those lines are becoming more and more blurred and that certain big companies in key positions are theoretically beyond accountability.

But the question was "why trust a company and not the government?"

So even now it's between:

  * A company who, if big enough and in a key position, could theoretically do this
And

  * A government who we know for sure has grabbed multiple people off the streets, within the past month, and has trafficked them out of the country without any due process.
So it's still "could maybe do harm" versus "already controls an army of masked men who are undeniably active in doing harm."
replies(2): >>45063473 #>>45063813 #
32. 6510 ◴[] No.45063405{5}[source]
Someone who used to live in a dictatorship told me there is one advantage to living under a dictator: no one believes what is said in the news or the official version of anything.
33. sillyfluke ◴[] No.45063473{7}[source]
>But the question was "why trust a company and not the government?"

The post you were replying to simply said the behavior of this administration made them care more about this issue, not that they trusted companies more than the government. That statement is not even implied in any way in the comment you responded to.

The fact is, whereas in the past it would be expected that the government could regulate the brutal and illegal overreaches of private companies, giving military rank to private-company execs makes that even less likely. The original comment is alluding to a simpler point: a government that gives blank checks to private companies in military and security matters is much worse than one that doesn't.

replies(1): >>45063876 #
34. degamad ◴[] No.45063591{9}[source]
Being locked out of your email, which is the username for most of the services you access, is a lot more than "you can't have your data back". You can't log on to anything that uses email 2FA, you can't restore access to other services, you can't validate your identity with online government services, you don't get your bank statements or warnings, etc. It's not as bad as being arrested, but it is massively disruptive to your life.
35. Cheer2171 ◴[] No.45063813{7}[source]
More like do you trust what's left of the US judicial branch versus the private arbitration company to save you from the excesses of their respective executives.

I'll still take an increasingly stacked US federal court that still has to pay lip service to the constitution over private arbitration hired by the company accountable only to their whims.

What you mentioned has been repeatedly ruled unconstitutional, but the administration is ignoring the courts.

36. const_cast ◴[] No.45063825{5}[source]
And how much can the US government censor you versus companies?

There's tradeoffs. The government, at least, has to abide by the constitution. Companies don't have to abide by jack shit.

That means infinite censorship, searches and seizures, discrimination, you name it.

We have SOME protections. Very few, but they're there. But if Uber were charging black people $0.50 more on average because their pricing model has some biases baked in, would anyone do anything?

replies(1): >>45064591 #
37. const_cast ◴[] No.45063861{5}[source]
And what technology do you think they use to do said imprisonment and seizing?

Why do you think the military and police outsource fucking everything to the private sector? Because there are no rules there.

Wanna make the brown people killer 5000 drone? Sure, go ahead. Wanna make a facial crime recognition system that treats all black faces as essentially the same? Sure, go ahead. Wanna run mass censorship and propaganda campaigns? Sure, go ahead.

The private sector does not abide by the constitution.

Look, stamping out a protest and rolling tanks is hard. It's gonna get on the news, it's gonna be challenged in court, the constitution exists; it's just a whole thing.

Just ask Meta to do it. Probably more effective anyway.

38. AlecSchueler ◴[] No.45063876{8}[source]
The comment I responded to said "Trusting companies more than the government always feels strange. It's something I can't grasp."
replies(1): >>45063922 #
39. sillyfluke ◴[] No.45063922{9}[source]
You're right, my bad. I meant the original context of the grandparent.
40. questionableans ◴[] No.45064403{4}[source]
I would love to see less pay inequality, but unfortunately, the median household in the US really doesn’t have it great due to the costs and risks of everyday life.

For comparison, I live in a place that is typically considered as tier 3 or 4 out of 4 in the US by employers (4 being the cheapest). Costs of living are honestly more like tier 2 cities, but it’s a small city in a poor state. 7 years ago, the going rate for an unlicensed handyman was $32/hour, often paid under the table in cash (I don’t have more recent numbers because I find DIY better and easier than hiring someone reliable).

41. SoftTalker ◴[] No.45064591{6}[source]
Yes, because race is a protected class.

If they were charging wealthy people $0.50 more on average because the model showed that they don't care about price that much, they would be fine.

replies(1): >>45066128 #
42. cowboylowrez ◴[] No.45065138[source]
I hope they didn't vibe code the popup; that could be bad if it didn't actually work.
43. const_cast ◴[] No.45066128{7}[source]
> Yes, because race is a protected class.

No: because Uber doesn't have to tell you how their model works and they probably don't even know.

replies(1): >>45066712 #
44. SoftTalker ◴[] No.45066712{8}[source]
Doesn't matter. If you can convincingly argue that the effect is discrimination based on race, you have a civil rights claim.
45. CatWChainsaw ◴[] No.45077638{6}[source]
Fascism is definitionally when government and companies team up to screw everyone else.