
747 points porridgeraisin | 2 comments
psychoslave ◴[] No.45062941[source]
What a surprise: a big corp collected a large amount of personal data under certain promises, and now reveals it will actually exploit it in a completely unrelated manner.
replies(7): >>45062982 #>>45063078 #>>45063239 #>>45064031 #>>45064041 #>>45064193 #>>45064287 #
the_arun ◴[] No.45064041[source]
From Anthropic's communication:

> If you’re an existing user, you have until September 28, 2025 to accept the updated Consumer Terms and make your decision. If you choose to accept the new policies now, they will go into effect immediately. These updates will apply only to new or resumed chats and coding sessions. After September 28, you’ll need to make your selection on the model training setting in order to continue using Claude. You can change your choice in your Privacy Settings at any time.

It doesn't clearly say whether this applies to all prompts from the past.

https://www.anthropic.com/news/updates-to-our-consumer-terms

replies(1): >>45064143 #
bubblyworld ◴[] No.45064143[source]
Under the FAQ:

> Previous chats with no additional activity will not be used for model training.

replies(4): >>45064247 #>>45064253 #>>45064399 #>>45064949 #
1. SantalBlush ◴[] No.45064247[source]
That will be quietly removed later.
replies(1): >>45064365 #
2. SoftTalker ◴[] No.45064365[source]
All your data are belong to us.