
747 points porridgeraisin | 1 comment
troad ◴[] No.45062852[source]
You can opt out, but the fact that it's enabled by default and presented as a simple T&C update prompt leaves a sour taste in my mouth. The five-year retention period seems... excessive. I wonder if they've buried anything else objectionable in the new terms.

It was the kick in the pants I needed to cancel my subscription.

replies(22): >>45062875 #>>45062894 #>>45062895 #>>45062930 #>>45062936 #>>45062949 #>>45062975 #>>45063015 #>>45063070 #>>45063116 #>>45063150 #>>45063171 #>>45063186 #>>45063387 #>>45063615 #>>45064792 #>>45064955 #>>45064986 #>>45064996 #>>45066593 #>>45070194 #>>45074231 #
demarq ◴[] No.45062949[source]
Are you sure the opt-out isn't only for training? The retention period doesn't seem to be affected by the toggle.
replies(2): >>45063038 #>>45063233 #
jasona123 ◴[] No.45063038[source]
From the PR update: https://www.anthropic.com/news/updates-to-our-consumer-terms

“If you do not choose to provide your data for model training, you’ll continue with our existing 30-day data retention period.”

From the support page: https://privacy.anthropic.com/en/articles/10023548-how-long-...

“If you choose not to allow us to use your chats and coding sessions to improve Claude, your chats will be retained in our back-end storage systems for up to 30 days.”