Am I the only one who assumed everything was already being used for training?
The cynic in me wonders if part of Anthropic's decision process here was that, since nobody believes you when you say you're not using their data for training, you might as well do it anyway.
Offering an opt-out might even increase trust, since people can now at least see an option they control.
This is why I have a love-hate relationship with Anthropic, the same way I do with Apple. The reason is simple: great products, shitty MBA-fueled managerial decisions.