747 points porridgeraisin | 11 comments

superposeur:
    Everyone seems to be unsurprised by this move, but I’m genuinely shocked. What a shoot your own foot business decision. Google, evil though it be, doesn’t post the text of your gmails in its search results because who would consider using Gmail after that? This is the llm equivalent. Am I missing something?
1. KoolKat23:
    This data is useful for reinforcement learning. All the others do it.

And most importantly, you can just opt out.

2. turnsout:
    You can't opt out of the data retention policy.
3. superposeur:
OK, to be clear: let's say I'm dumb and accidentally go with the default (I get the color of the opt-out button wrong or something). As if there's a "publish my private emails to the internet" default-on button in email. Then I use it to edit a rec letter for student X, with my signature Y. (Yes, I know this is dumb, and I try changing names when editing, but I'm sure some actual names may slip through.) A few months later the next model is released, trained on that data. Student X asks Claude what Y would write in a rec letter about X. Such a button is a "wings stay on / wings fall off" button on a plane.
4. smca:
The data retention period is 30 days if you don't opt in to model training. https://www.anthropic.com/news/updates-to-our-consumer-terms...
5. behnamoh:
    Just because all the others do it doesn’t make it right. Many users chose Anthropic exactly because they were not like the others.
6. turnsout:
Oh, I didn't catch this. That's good news.
7. franga2000:
You're severely overestimating the model's ability to recall a single, mostly uninteresting item from its billions of input documents.
8. wolvesechoes:
    > Many users chose Anthropic exactly because they were not like the others.

Oh, the naivety.

Sooner or later they all become the same, usually soon after the "investors" or "shareholders" arrive.

9. KoolKat23:
There's no reason to be shocked by the practice, however.
10. behnamoh:
    > Sooner or later they all become the same, soon after "investors" or "shareholders" arrive.

They've already arrived. Google was one of the main investors in Anthropic.

11. const_cast:
    > Many users chose Anthropic exactly because they were not like the others.

    Companies are less like people and more like bacteria. They are programmatic, like algorithms.

    What they will do has already been decided for them, programmed into them, by the rules of capitalism. It is inevitable. There are no good guys, and there are no bad guys, there's just... microbes.

Those who do not engage in capitalism, perhaps because they do not seek money at all, have no such hard limitations. But they are rare, because money is blood.