
439 points | diggan | 1 comment
AlecSchueler No.45062904
Am I the only one that assumed everything was already being used for training?
replies(9): >>45062929 #>>45063168 #>>45063951 #>>45064966 #>>45065323 #>>45065428 #>>45065912 #>>45066950 #>>45070135 #
Aurornis No.45065912
I don't understand this mindset. Why would you assume anything? It took me a couple minutes at most to check when I first started using Claude.

I check when I start using any new service. The cynical assumption that everything's being shared leads to shrugging it off and making no attempt to look for settings.

It only takes a moment to go into settings -> privacy and look.

replies(7): >>45065932 #>>45065968 #>>45066053 #>>45066125 #>>45068206 #>>45068998 #>>45070223 #
hshdhdhj4444 No.45065932
Huh, they’re not assuming anything is “being shared.”

They’re assuming that Anthropic, which is already receiving and storing your data, is also training its models on that data.

How are you supposed to disprove that as a user?

Also, the whole point is that companies cannot be trusted to follow the settings.

replies(1): >>45067803 #
simonw No.45067803
Why can't companies be trusted to follow the settings?

If they add those settings, why would you expect they wouldn't respect them? Do you think they're purely cosmetic features that don't actually do anything?

replies(3): >>45070033 #>>45070257 #>>45080866 #
fcarraldo No.45070033
Because they can’t be?

https://www.reuters.com/sustainability/boards-policy-regulat...

https://www.bbc.com/news/articles/cx2jmledvr3o

replies(1): >>45071304 #
simonw No.45071304
There is an enormous gap between the behavior covered in those two cases and training machine learning models on user data that a company has specifically said it will not use for training.