
745 points by melded | 1 comment
joshcsimmons No.45946838
This is extremely important work; thank you for sharing it. We are in the process of giving up our own moral standing in favor of taking on the ones imbued into LLMs by their creators. This is a worrying trend that will totally wipe out intellectual diversity.
EbEsacAig No.45947071
> We are in the process of giving up our own moral standing in favor of taking on the ones imbued into LLMs by their creators. This is a worrying trend that will totally wipe out intellectual diversity.

That trend is a consequence -- a consequence of people being too lazy to think for themselves. Critical thinking is harder still than simply thinking for yourself, so someone too lazy to make that effort, who reaches for an LLM at once, is by definition ill-equipped to be critical of the cultural/moral "side-channel" in the LLM's output.

This is not new. It's no accident that whoever writes the history books for students holds the power, and whoever holds the power writes the history books. The primary subject matter is just a carrier for indoctrination.

Not that I disagree with you. It's always been important to use tools in ways unforeseen, or even forbidden, by their creators.

Personally, I distrust -- based on first-hand experience -- even the primary output of LLMs so much that I reach for them only as a last resort, mostly when I need a "Google Search" that is better than Google Search. Apart from getting quickly verifiable web references out of LLMs, their output has been a disgrace for me. Because I'm mostly opposed even to the primary output of LLMs, to begin with, I believe to be somewhat protected from their creators' subliminal messaging. I hope anyway.

Eisenstein No.45955861
> Because I'm mostly opposed even to the primary output of LLMs, to begin with, I believe to be somewhat protected from their creators' subliminal messaging. I hope anyway.

Avoiding something because you fear your own conclusions are not solid enough to withstand being challenged is not critical thinking; it is the opposite of it.

EbEsacAig No.45960415
I agree with you, but your statement doesn't contradict my point. The reason I avoid LLMs is not that I'm too fearful to have my morals tested by their cultural/moral side-channels; it's that they suck -- they are mostly useless at their primary function. A convenient, fortunate consequence is that I don't get exposed to those side-channels either.