
439 points by diggan | 7 comments
AlecSchueler No.45062904
Am I the only one who assumed everything was already being used for training?
Aurornis No.45065912
I don't understand this mindset. Why would you assume anything? It took me a couple minutes at most to check when I first started using Claude.

I check when I start using any new service. The cynical assumption that everything's being shared leads to shrugging it off and making no attempt to look for settings.

It only takes a moment to go into Settings → Privacy and look.

1. hshdhdhj4444 No.45065932
Huh, they’re not assuming anything is “being shared”.

They’re assuming that Anthropic, which is already receiving and storing your data, is also training its models on that data.

How are you supposed to disprove that as a user?

Also, the whole point is that companies cannot be trusted to follow the settings.

2. simonw No.45067803
Why can't companies be trusted to follow the settings?

If they add those settings why would you expect they wouldn't respect them? Do you think they're purely cosmetic features that don't actually do anything?

3. fcarraldo No.45070033
Because they can’t be?

https://www.reuters.com/sustainability/boards-policy-regulat...

https://www.bbc.com/news/articles/cx2jmledvr3o

4. fcarraldo No.45070257
Also currently being discussed[0] on this very site are both speculation that Meta is surreptitiously scanning your camera roll and a comment from someone claiming they worked on an earlier implementation to do just that.

It’s shocking to me that anyone who works in our industry would trust any company to do as they claim.

[0] https://news.ycombinator.com/item?id=45062910

5. simonw No.45071304
There is an enormous gap between the behavior covered in those two cases and training machine learning models on user data that a company has specifically said it will not use for training.
6. AlecSchueler No.45080866
Have you really never heard of companies saying one thing while doing another?
7. simonw No.45081600
Yes, normally when they lose a lawsuit over it.