439 points diggan | 6 comments
1. mrdependable ◴[] No.45066505[source]
This is going to turn into one of those situations where, down the line, we find out they trained on everyone whether they opted out or not. I want to keep using Claude, but I also don't want all the solutions I come up with to become common knowledge.
replies(3): >>45066616 #>>45066931 #>>45066953 #
2. treyd ◴[] No.45066616[source]
Why don't you want to share your insights? I agree doing it in a more direct way would be better than it leaking through AI training you don't control. But your phrasing seems stronger than that.
3. ukd1 ◴[] No.45066931[source]
I think I'm fine with the model getting better because of something I helped with or co-created with it. I'm not happy if it's directly 1:1 or attributed to me; the Chatham House Rule for this would be great.
replies(1): >>45072310 #
4. skybrian ◴[] No.45066953[source]
When has a company ignored an opt-out preference before? It sounds like you have something in mind.
replies(1): >>45069010 #
5. tagawa ◴[] No.45069010[source]
Not the OP but…

https://www.jdsupra.com/legalnews/healthline-media-agrees-to...

"Healthline.com provided an opt-out mechanism, but it was misconfigured and Healthline failed to test it, resulting in data being shared with third parties even after consumers elected to opt out."

https://www.bbc.com/news/technology-65772154

"The company agreed to pay the US Federal Trade Commission (FTC) after it was accused of failing to delete Alexa recordings at the request of parents."

https://www.mediapost.com/publications/article/405635/califo...

"According to the [California Privacy Protection] agency, Todd Snyder told website visitors they could opt out of data sharing, but didn't actually allow them to do so for 40 days in late 2023 because its opt-out mechanism was improperly configured."

6. adastra22 ◴[] No.45072310[source]
That’s impossible. You can’t anonymize data at scale.