
747 points | porridgeraisin | 1 comment
aurareturn No.45062782
Just opened Claude app on Mac and saw a popup asking me if it's ok to train on my chats. It's on by default. Unchecked it.

I think Anthropic saw that OpenAI was reaping too much benefit from this, so they decided to do it too.

replies(5): >>45062800, >>45062824, >>45062865, >>45063224, >>45065138
demarq No.45062800
Also, your chats will now be stored for 5 years.
replies(2): >>45062821, >>45062948
OtherShrezzing No.45062948
And there's no way to opt out of the training without agreeing to the 5-year retention. Anthropic has slipped so far, so fast, from its objective of being the ethical AI company.
replies(1): >>45063194
smca No.45063194
> If you do not choose to provide your data for model training, you’ll continue with our existing 30-day data retention period.

https://www.anthropic.com/news/updates-to-our-consumer-terms