In my opinion, training models on user data without their real consent (real consent meaning, e.g., the user has to sign a contract or something similar, so they're definitely aware) should be considered a serious criminal offense.
I think it's cute that people believe companies that trained their models on every single book and online page ever written without consent from the authors (and often against the author's explicit request, with no opt-out) won't do a rug-pull and do the same with all the chats they've acquired...
You're absolutely right, but isn't the volume of new data they're getting from chats tiny compared to what they've already trained on? I'm wondering how much difference it will really make.