
747 points | porridgeraisin | 1 comment
I_am_tiberius No.45062905
In my opinion, training models on user data without real consent should be considered a serious criminal offense. By real consent I mean something explicit, e.g. the user signs a contract, so that they are definitely aware of it.
replies(5): >>45062989 #>>45063008 #>>45063221 #>>45063771 #>>45064402 #
1. zajio1am No.45064402
Why? This is not 'use collected information to target ads' or 'sell collected information to third parties', but 'use information collected from the service to improve the service'. That does not seem much different to me than ISPs using traffic stats to plan infrastructure improvements, or a website using access logs to improve accessibility and navigation.

And when talking specifically about AI, one could argue that learning from interactions is a common aspect of intelligence, so a casual user who does not understand the details of LLMs would expect it anyway. Also, the fact that LLMs (and other neural networks) have distinct training and inference phases seems more like an implementation detail.