
747 points porridgeraisin | 3 comments | source
Syzygies ◴[] No.45063736[source]
Claude assists me in my math research.

The scenario that concerns me is that Claude learns unpublished research ideas from me as we chat and code. Claude then suggests these same ideas to someone else, who legitimately believes this is now their work.

Commercial accounts clearly use AI to assist in developing intellectual product, and privacy is mandatory there. The same can apply to individuals.

replies(9): >>45063744 #>>45064034 #>>45064105 #>>45064140 #>>45064248 #>>45064416 #>>45064428 #>>45065522 #>>45065601 #
1. Aurornis ◴[] No.45063744[source]
When you get the pop-up about the new terms, select the “opt out” option. Then your chats will not be used for training.
replies(1): >>45064328 #
2. Klonoar ◴[] No.45064328[source]
Well, theoretically they won’t.

Anyone who’s worked on an engineering team is familiar with someone forgetting to check ‘if (doNotDoThisCondition)’.

This is why (among many other reasons) opt-in is more user-respecting here than opt-out.
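
To make the failure mode concrete, here is a minimal sketch (TypeScript, with hypothetical names; this is not Anthropic's actual pipeline) of how one forgotten guard quietly routes opted-out chats into a training batch:

    // Hypothetical ingestion step. All names and types are illustrative only.
    interface Chat {
      userId: string;
      messages: string[];
      optedOutOfTraining: boolean; // the flag the opt-out pop-up is supposed to set
    }

    function collectTrainingBatch(chats: Chat[]): Chat[] {
      const batch: Chat[] = [];
      for (const chat of chats) {
        // The single guard that keeps the promise. If a new ingestion
        // path forgets this check, opted-out chats flow into training
        // and nothing visibly breaks.
        if (chat.optedOutOfTraining) {
          continue;
        }
        batch.push(chat);
      }
      return batch;
    }

Nothing downstream fails loudly if that check is missing, which is the point: opt-out puts the burden of remembering it on every code path that touches chat data.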

replies(1): >>45064436 #
3. SoftTalker ◴[] No.45064436[source]
Forgetting. Riiighht.